[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
28011 1726882530.05161: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-spT
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
28011 1726882530.06428: Added group all to inventory
28011 1726882530.06430: Added group ungrouped to inventory
28011 1726882530.06434: Group all now contains ungrouped
28011 1726882530.06437: Examining possible inventory source: /tmp/network-Kc3/inventory.yml
28011 1726882530.25740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
28011 1726882530.25802: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
28011 1726882530.25824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
28011 1726882530.25883: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
28011 1726882530.25956: Loaded config def from plugin (inventory/script)
28011 1726882530.25958: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
28011 1726882530.26000: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
28011 1726882530.26087: Loaded config def from plugin (inventory/yaml)
28011 1726882530.26089: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
28011 1726882530.26176: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
28011 1726882530.26590: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
28011 1726882530.26595: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
28011 1726882530.26598: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
28011 1726882530.26604: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
28011 1726882530.26608: Loading data from /tmp/network-Kc3/inventory.yml
28011 1726882530.26675: /tmp/network-Kc3/inventory.yml was not parsable by auto
28011 1726882530.26740: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
28011 1726882530.26779: Loading data from /tmp/network-Kc3/inventory.yml
28011 1726882530.26864: group all already in inventory
28011 1726882530.26870: set inventory_file for managed_node1
28011 1726882530.26873: set inventory_dir for managed_node1
28011 1726882530.26873: Added host managed_node1 to inventory
28011 1726882530.26876: Added host managed_node1 to group all
28011 1726882530.26878: set ansible_host for managed_node1
28011 1726882530.26879: set ansible_ssh_extra_args for managed_node1
28011 1726882530.26885: set inventory_file for managed_node2
28011 1726882530.26889: set inventory_dir for managed_node2
28011 1726882530.26890: Added host managed_node2 to inventory
28011 1726882530.26891: Added host managed_node2 to group all
28011 1726882530.26892: set ansible_host for managed_node2
28011 1726882530.26894: set ansible_ssh_extra_args for managed_node2
28011 1726882530.26900: set inventory_file for managed_node3
28011 1726882530.26903: set inventory_dir for managed_node3
28011 1726882530.26904: Added host managed_node3 to inventory
28011 1726882530.26906: Added host managed_node3 to group all
28011 1726882530.26907: set ansible_host for managed_node3
28011 1726882530.26907: set ansible_ssh_extra_args for managed_node3
28011 1726882530.26910: Reconcile groups and hosts in inventory.
28011 1726882530.26913: Group ungrouped now contains managed_node1
28011 1726882530.26915: Group ungrouped now contains managed_node2
28011 1726882530.26917: Group ungrouped now contains managed_node3
28011 1726882530.27000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
28011 1726882530.27124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
28011 1726882530.27170: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
28011 1726882530.27199: Loaded config def from plugin (vars/host_group_vars)
28011 1726882530.27202: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
28011 1726882530.27209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
28011 1726882530.27217: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
28011 1726882530.27259: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
28011 1726882530.27583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882530.27675: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
28011 1726882530.27715: Loaded config def from plugin (connection/local)
28011 1726882530.27719: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
28011 1726882530.28217: Loaded config def from plugin (connection/paramiko_ssh)
28011 1726882530.28220: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
28011 1726882530.29076: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
28011 1726882530.29113: Loaded config def from plugin (connection/psrp)
28011 1726882530.29115: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
28011 1726882530.29809: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
28011 1726882530.29847: Loaded config def from plugin (connection/ssh)
28011 1726882530.29850: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
28011 1726882530.31271: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
28011 1726882530.31298: Loaded config def from plugin (connection/winrm)
28011 1726882530.31300: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
28011 1726882530.31320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
28011 1726882530.31361: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
28011 1726882530.31407: Loaded config def from plugin (shell/cmd)
28011 1726882530.31408: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
28011 1726882530.31425: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
28011 1726882530.31460: Loaded config def from plugin (shell/powershell)
28011 1726882530.31461: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
28011 1726882530.31502: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
28011 1726882530.31606: Loaded config def from plugin (shell/sh)
28011 1726882530.31608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
28011 1726882530.31630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
28011 1726882530.31704: Loaded config def from plugin (become/runas)
28011 1726882530.31707: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
28011 1726882530.31817: Loaded config def from plugin (become/su)
28011 1726882530.31819: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
28011 1726882530.31912: Loaded config def from plugin (become/sudo)
28011 1726882530.31914: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
28011 1726882530.31951: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml
28011 1726882530.32278: in VariableManager get_vars()
28011 1726882530.32306: done with get_vars()
28011 1726882530.32430: trying /usr/local/lib/python3.12/site-packages/ansible/modules
28011 1726882530.34486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
28011 1726882530.34556: in VariableManager get_vars()
28011 1726882530.34559: done with get_vars()
28011 1726882530.34561: variable 'playbook_dir' from source: magic vars
28011 1726882530.34561: variable 'ansible_playbook_python' from source: magic vars
28011 1726882530.34562: variable 'ansible_config_file' from source: magic vars
28011 1726882530.34562: variable 'groups' from source: magic vars
28011 1726882530.34563: variable 'omit' from source: magic vars
28011 1726882530.34563: variable 'ansible_version' from source: magic vars
28011 1726882530.34564: variable 'ansible_check_mode' from source: magic vars
28011 1726882530.34564: variable 'ansible_diff_mode' from source: magic vars
28011 1726882530.34564: variable 'ansible_forks' from source: magic vars
28011 1726882530.34565: variable 'ansible_inventory_sources' from source: magic vars
28011 1726882530.34565: variable 'ansible_skip_tags' from source: magic vars
28011 1726882530.34566: variable 'ansible_limit' from source: magic vars
28011 1726882530.34566: variable 'ansible_run_tags' from source: magic vars
28011 1726882530.34567: variable 'ansible_verbosity' from source: magic vars
28011 1726882530.34587: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml
28011 1726882530.35022: in VariableManager get_vars()
28011 1726882530.35031: done with get_vars()
28011 1726882530.35056: in VariableManager get_vars()
28011 1726882530.35064: done with get_vars()
28011 1726882530.35084: in VariableManager get_vars()
28011 1726882530.35095: done with get_vars()
28011 1726882530.35121: in VariableManager get_vars()
28011 1726882530.35129: done with get_vars()
28011 1726882530.35132: variable 'omit' from source: magic vars
28011 1726882530.35143: variable 'omit' from source: magic vars
28011 1726882530.35165: in VariableManager get_vars()
28011 1726882530.35172: done with get_vars()
28011 1726882530.35205: in VariableManager get_vars()
28011 1726882530.35214: done with get_vars()
28011 1726882530.35237: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
28011 1726882530.35363: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
28011 1726882530.35443: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
28011 1726882530.35813: in VariableManager get_vars()
28011 1726882530.35825: done with get_vars()
28011 1726882530.36107: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
28011 1726882530.36188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
28011 1726882530.37481: in VariableManager get_vars()
28011 1726882530.37496: done with get_vars()
28011 1726882530.37499: variable 'omit' from source: magic vars
28011 1726882530.37505: variable 'omit' from source: magic vars
28011 1726882530.37523: in VariableManager get_vars()
28011 1726882530.37530: done with get_vars()
28011 1726882530.37542: in VariableManager get_vars()
28011 1726882530.37553: done with get_vars()
28011 1726882530.37570: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
28011 1726882530.37633: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
28011 1726882530.37677: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
28011 1726882530.38981: in VariableManager get_vars()
28011 1726882530.38998: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
28011 1726882530.40340: in VariableManager get_vars()
28011 1726882530.40342: done with get_vars()
28011 1726882530.40344: variable 'playbook_dir' from source: magic vars
28011 1726882530.40344: variable 'ansible_playbook_python' from source: magic vars
28011 1726882530.40345: variable 'ansible_config_file' from source: magic vars
28011 1726882530.40345: variable 'groups' from source: magic vars
28011 1726882530.40346: variable 'omit' from source: magic vars
28011 1726882530.40346: variable 'ansible_version' from source: magic vars
28011 1726882530.40347: variable 'ansible_check_mode' from source: magic vars
28011 1726882530.40347: variable 'ansible_diff_mode' from source: magic vars
28011 1726882530.40348: variable 'ansible_forks' from source: magic vars
28011 1726882530.40348: variable 'ansible_inventory_sources' from source: magic vars
28011 1726882530.40348: variable 'ansible_skip_tags' from source: magic vars
28011 1726882530.40349: variable 'ansible_limit' from source: magic vars
28011 1726882530.40349: variable 'ansible_run_tags' from source: magic vars
28011 1726882530.40350: variable 'ansible_verbosity' from source: magic vars
28011 1726882530.40369: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
28011 1726882530.40421: in VariableManager get_vars()
28011 1726882530.40424: done with get_vars()
28011 1726882530.40425: variable 'playbook_dir' from source: magic vars
28011 1726882530.40426: variable 'ansible_playbook_python' from source: magic vars
28011 1726882530.40426: variable 'ansible_config_file' from source: magic vars
28011 1726882530.40427: variable 'groups' from source: magic vars
28011 1726882530.40427: variable 'omit' from source: magic vars
28011 1726882530.40428: variable 'ansible_version' from source: magic vars
28011 1726882530.40428: variable 'ansible_check_mode' from source: magic vars
28011 1726882530.40428: variable 'ansible_diff_mode' from source: magic vars
28011 1726882530.40429: variable 'ansible_forks' from source: magic vars
28011 1726882530.40429: variable 'ansible_inventory_sources' from source: magic vars
28011 1726882530.40433: variable 'ansible_skip_tags' from source: magic vars
28011 1726882530.40434: variable 'ansible_limit' from source: magic vars
28011 1726882530.40434: variable 'ansible_run_tags' from source: magic vars
28011 1726882530.40435: variable 'ansible_verbosity' from source: magic vars
28011 1726882530.40454: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
28011 1726882530.40512: in VariableManager get_vars()
28011 1726882530.40520: done with get_vars()
28011 1726882530.40545: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
28011 1726882530.40610: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
28011 1726882530.40652: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
28011 1726882530.40875: in VariableManager get_vars()
28011 1726882530.40886: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
28011 1726882530.41906: in VariableManager get_vars()
28011 1726882530.41917: done with get_vars()
28011 1726882530.41939: in VariableManager get_vars()
28011 1726882530.41940: done with get_vars()
28011 1726882530.41942: variable 'playbook_dir' from source: magic vars
28011 1726882530.41942: variable 'ansible_playbook_python' from source: magic vars
28011 1726882530.41943: variable 'ansible_config_file' from source: magic vars
28011 1726882530.41943: variable 'groups' from source: magic vars
28011 1726882530.41944: variable 'omit' from source: magic vars
28011 1726882530.41944: variable 'ansible_version' from source: magic vars
28011 1726882530.41945: variable 'ansible_check_mode' from source: magic vars
28011 1726882530.41945: variable 'ansible_diff_mode' from source: magic vars
28011 1726882530.41945: variable 'ansible_forks' from source: magic vars
28011 1726882530.41946: variable 'ansible_inventory_sources' from source: magic vars
28011 1726882530.41946: variable 'ansible_skip_tags' from source: magic vars
28011 1726882530.41947: variable 'ansible_limit' from source: magic vars
28011 1726882530.41947: variable 'ansible_run_tags' from source: magic vars
28011 1726882530.41948: variable 'ansible_verbosity' from source: magic vars
28011 1726882530.41966: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
28011 1726882530.42013: in VariableManager get_vars()
28011 1726882530.42022: done with get_vars()
28011 1726882530.42048: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
28011 1726882530.42122: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
28011 1726882530.42166: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
28011 1726882530.42384: in VariableManager get_vars()
28011 1726882530.42399: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
28011 1726882530.43381: in VariableManager get_vars()
28011 1726882530.43389: done with get_vars()
28011 1726882530.43415: in VariableManager get_vars()
28011 1726882530.43422: done with get_vars()
28011 1726882530.43444: in VariableManager get_vars()
28011 1726882530.43451: done with get_vars()
28011 1726882530.43495: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
28011 1726882530.43514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
28011 1726882530.43674: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
28011 1726882530.43764: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
28011 1726882530.43766: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
28011 1726882530.43787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
28011 1726882530.43807: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
28011 1726882530.43907: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
28011 1726882530.43942: Loaded config def from plugin (callback/default)
28011 1726882530.43943: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
28011 1726882530.44692: Loaded config def from plugin (callback/junit)
28011 1726882530.44696: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
28011 1726882530.44726: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
28011 1726882530.44766: Loaded config def from plugin (callback/minimal)
28011 1726882530.44767: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
28011 1726882530.44797: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
28011 1726882530.44836: Loaded config def from plugin (callback/tree)
28011 1726882530.44837: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
28011 1726882530.44912: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
28011 1726882530.44913: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_route_table_nm.yml *********************************************
6 plays in /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml
28011 1726882530.44934: in VariableManager get_vars()
28011 1726882530.44941: done with get_vars()
28011 1726882530.44945: in VariableManager get_vars()
28011 1726882530.44950: done with get_vars()
28011 1726882530.44952: variable 'omit' from source: magic vars
28011 1726882530.44975: in VariableManager get_vars()
28011 1726882530.44984: done with get_vars()
28011 1726882530.45000: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_route_table.yml' with nm as provider] ******
28011 1726882530.45410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
28011 1726882530.45457: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
28011 1726882530.45482: getting the remaining hosts for this loop
28011 1726882530.45484: done getting the remaining hosts for this loop
28011 1726882530.45485: getting the next task for host managed_node1
28011 1726882530.45488: done getting next task for host managed_node1
28011 1726882530.45489: ^ task is: TASK: Gathering Facts
28011 1726882530.45492: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28011 1726882530.45495: getting variables
28011 1726882530.45496: in VariableManager get_vars()
28011 1726882530.45502: Calling all_inventory to load vars for managed_node1
28011 1726882530.45504: Calling groups_inventory to load vars for managed_node1
28011 1726882530.45505: Calling all_plugins_inventory to load vars for managed_node1
28011 1726882530.45514: Calling all_plugins_play to load vars for managed_node1
28011 1726882530.45522: Calling groups_plugins_inventory to load vars for managed_node1
28011 1726882530.45525: Calling groups_plugins_play to load vars for managed_node1
28011 1726882530.45546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882530.45578: done with get_vars()
28011 1726882530.45582: done getting variables
28011 1726882530.45642: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:6
Friday 20 September 2024 21:35:30 -0400 (0:00:00.007) 0:00:00.007 ******
28011 1726882530.45656: entering _queue_task() for managed_node1/gather_facts
28011 1726882530.45657: Creating lock for gather_facts
28011 1726882530.45931: worker is 1 (out of 1 available)
28011 1726882530.45942: exiting _queue_task() for managed_node1/gather_facts
28011 1726882530.45953: done queuing things up, now waiting for results queue to drain
28011 1726882530.45955: waiting for pending results...
28011 1726882530.46092: running TaskExecutor() for managed_node1/TASK: Gathering Facts
28011 1726882530.46141: in run() - task 12673a56-9f93-962d-7c65-0000000000f5
28011 1726882530.46155: variable 'ansible_search_path' from source: unknown
28011 1726882530.46182: calling self._execute()
28011 1726882530.46229: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882530.46234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882530.46242: variable 'omit' from source: magic vars
28011 1726882530.46317: variable 'omit' from source: magic vars
28011 1726882530.46335: variable 'omit' from source: magic vars
28011 1726882530.46360: variable 'omit' from source: magic vars
28011 1726882530.46395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
28011 1726882530.46421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
28011 1726882530.46498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
28011 1726882530.46501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28011 1726882530.46504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28011 1726882530.46506: variable 'inventory_hostname' from source: host vars for 'managed_node1'
28011 1726882530.46508: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882530.46511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882530.46554: Set connection var ansible_connection to ssh
28011 1726882530.46561: Set connection var ansible_pipelining to False
28011 1726882530.46567: Set connection var ansible_module_compression to ZIP_DEFLATED
28011 1726882530.46572: Set connection var ansible_shell_executable to /bin/sh
28011 1726882530.46579: Set connection var ansible_timeout to 10
28011 1726882530.46583: Set connection var ansible_shell_type to sh
28011 1726882530.46606: variable 'ansible_shell_executable' from source: unknown
28011 1726882530.46609: variable 'ansible_connection' from source: unknown
28011 1726882530.46612: variable 'ansible_module_compression' from source: unknown
28011 1726882530.46614: variable 'ansible_shell_type' from source: unknown
28011 1726882530.46616: variable 'ansible_shell_executable' from source: unknown
28011 1726882530.46619: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882530.46623: variable 'ansible_pipelining' from source: unknown
28011 1726882530.46625: variable 'ansible_timeout' from source: unknown
28011 1726882530.46629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882530.46757: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
28011 1726882530.46768: variable 'omit' from source: magic vars
28011 1726882530.46772: starting attempt loop
28011 1726882530.46774: running the handler
28011 1726882530.46784: variable 'ansible_facts' from source: unknown
28011 1726882530.46804: _low_level_execute_command(): starting
28011 1726882530.46811: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
28011 1726882530.47313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
28011 1726882530.47317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28011 1726882530.47320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28011 1726882530.47373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
28011 1726882530.47376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28011 1726882530.47378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28011 1726882530.47436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28011 1726882530.49110: stdout chunk (state=3): >>>/root <<<
28011 1726882530.49209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28011 1726882530.49236: stderr chunk (state=3): >>><<<
28011 1726882530.49240: stdout chunk (state=3): >>><<<
28011 1726882530.49263: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28011 1726882530.49270: _low_level_execute_command(): starting
28011 1726882530.49275: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420 `" && echo ansible-tmp-1726882530.492573-28055-55765411688420="` echo /root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420 `" ) && sleep 0'
28011 1726882530.49682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
28011 1726882530.49685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28011 1726882530.49707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28011 1726882530.49759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
28011 1726882530.49778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28011 1726882530.49860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28011 1726882530.51705: stdout chunk (state=3): >>>ansible-tmp-1726882530.492573-28055-55765411688420=/root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420 <<<
28011 1726882530.51811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28011 1726882530.51831: stderr chunk (state=3): >>><<<
28011 1726882530.51834: stdout chunk (state=3): >>><<<
28011 1726882530.51847: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882530.492573-28055-55765411688420=/root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882530.51874: variable 'ansible_module_compression' from source: unknown 28011 1726882530.51915: ANSIBALLZ: Using generic lock for ansible.legacy.setup 28011 1726882530.51918: ANSIBALLZ: Acquiring lock 28011 1726882530.51921: ANSIBALLZ: Lock acquired: 139767565767152 28011 1726882530.51923: ANSIBALLZ: Creating module 28011 1726882530.78704: ANSIBALLZ: Writing module into payload 28011 1726882530.78789: ANSIBALLZ: Writing module 28011 1726882530.78824: ANSIBALLZ: Renaming module 28011 1726882530.78836: ANSIBALLZ: Done creating module 28011 1726882530.78877: variable 'ansible_facts' from source: unknown 28011 1726882530.78889: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882530.78905: _low_level_execute_command(): starting 28011 1726882530.78919: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 28011 1726882530.79554: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882530.79589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882530.79687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882530.79713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882530.79796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882530.81388: stdout chunk (state=3): >>>PLATFORM <<< 28011 1726882530.81465: stdout chunk (state=3): >>>Linux <<< 28011 1726882530.81513: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 <<< 28011 1726882530.81516: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 28011 1726882530.81692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882530.81697: stdout chunk (state=3): >>><<< 28011 1726882530.81699: stderr chunk (state=3): >>><<< 28011 1726882530.81847: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882530.81853 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 28011 1726882530.81857: _low_level_execute_command(): starting 28011 1726882530.81859: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 28011 1726882530.82168: Sending initial data 28011 1726882530.82171: Sent initial data (1181 bytes) 28011 1726882530.82579: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882530.82647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882530.82678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882530.82713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882530.82803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882530.86191: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 28011 1726882530.86803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882530.86807: stdout chunk (state=3): >>><<< 28011 1726882530.86810: stderr chunk (state=3): >>><<< 28011 1726882530.86812: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882530.86814: variable 'ansible_facts' from source: unknown 28011 1726882530.86817: variable 'ansible_facts' from source: unknown 28011 1726882530.86820: variable 'ansible_module_compression' from source: unknown 28011 1726882530.86822: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28011 1726882530.86824: variable 'ansible_facts' from source: unknown 28011 1726882530.87026: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420/AnsiballZ_setup.py 28011 1726882530.87171: Sending initial data 28011 1726882530.87206: Sent initial data (152 bytes) 28011 1726882530.87830: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882530.87901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882530.87950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882530.87963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882530.87981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882530.88052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882530.89560: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28011 1726882530.89577: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882530.89639: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882530.89682: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp9a6sj799 /root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420/AnsiballZ_setup.py <<< 28011 1726882530.89692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420/AnsiballZ_setup.py" <<< 28011 1726882530.89719: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp9a6sj799" to remote "/root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420/AnsiballZ_setup.py" <<< 28011 1726882530.91211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882530.91249: stderr chunk (state=3): >>><<< 28011 1726882530.91357: stdout chunk (state=3): >>><<< 28011 1726882530.91360: done transferring module to remote 28011 1726882530.91362: _low_level_execute_command(): starting 28011 1726882530.91365: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420/ 
/root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420/AnsiballZ_setup.py && sleep 0' 28011 1726882530.91938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882530.91971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882530.92133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882530.93863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882530.93867: stdout chunk (state=3): >>><<< 28011 1726882530.93875: stderr chunk (state=3): >>><<< 28011 1726882530.93902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882530.93910: _low_level_execute_command(): starting 28011 1726882530.93919: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420/AnsiballZ_setup.py && sleep 0' 28011 1726882530.94484: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882530.94503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882530.94520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882530.94543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882530.94559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882530.94655: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882530.94708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882530.94736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882530.97187: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 28011 1726882530.97225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57465104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57464dfb30> <<< 28011 1726882530.97319: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5746512a50> <<< 28011 1726882530.97323: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 28011 1726882530.97353: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 28011 1726882530.97458: stdout chunk (state=3): >>>import '_collections_abc' # <<< 28011 1726882530.97757: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57462e5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57462e5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 28011 1726882530.98096: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 28011 1726882530.98124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 28011 1726882530.98147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882530.98162: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 28011 1726882530.98203: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 28011 1726882530.98220: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 28011 1726882530.98247: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 28011 1726882530.98264: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5746323dd0> <<< 28011 1726882530.98288: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 28011 1726882530.98318: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5746323fe0> <<< 28011 1726882530.98346: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 28011 1726882530.98370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 28011 1726882530.98398: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 28011 1726882530.98450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882530.98484: stdout chunk (state=3): >>>import 'itertools' # <<< 28011 1726882530.98509: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574635b7a0> <<< 28011 1726882530.98536: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574635be30> <<< 28011 1726882530.98720: stdout chunk (state=3): >>>import '_collections' # <<< 28011 1726882530.98740: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574633baa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463391c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5746320f80> <<< 28011 1726882530.98759: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 28011 1726882530.98782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 28011 1726882530.98809: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 28011 1726882530.98854: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 28011 1726882530.98859: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 28011 1726882530.98910: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574637b710> <<< 28011 1726882530.98916: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574637a330> <<< 28011 1726882530.98943: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574633a090> <<< 28011 1726882530.98963: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5746378b90> <<< 28011 1726882530.99008: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 28011 1726882530.99038: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b0740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5746320200> <<< 28011 1726882530.99090: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 28011 1726882530.99120: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module 
'_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57463b0bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b0aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882530.99143: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57463b0e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574631ed20> <<< 28011 1726882530.99218: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 28011 1726882530.99244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 28011 1726882530.99255: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b1580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b1250> import 'importlib.machinery' # <<< 28011 1726882530.99322: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 28011 1726882530.99337: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b2480> import 'importlib.util' # import 'runpy' # <<< 28011 
1726882530.99420: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 28011 1726882530.99432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463c8680> import 'errno' # <<< 28011 1726882530.99476: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882530.99480: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57463c9d60> <<< 28011 1726882530.99545: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 28011 1726882530.99548: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463cac00> <<< 28011 1726882530.99590: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57463cb260> <<< 28011 1726882530.99661: stdout chunk (state=3): >>>import 'bz2' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f57463ca150> <<< 28011 1726882530.99664: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 28011 1726882530.99736: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57463cbce0> <<< 28011 1726882530.99742: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463cb410> <<< 28011 1726882530.99760: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 28011 1726882530.99865: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 28011 1726882530.99883: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57460c7bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 28011 1726882530.99906: stdout chunk (state=3): >>># extension module 
'_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882530.99926: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57460f06e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57460f0440> <<< 28011 1726882530.99945: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57460f0710> <<< 28011 1726882530.99979: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 28011 1726882530.99989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 28011 1726882531.00054: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882531.00186: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57460f1040> <<< 28011 1726882531.00301: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57460f1a30> <<< 28011 1726882531.00318: stdout chunk 
(state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57460f08f0> <<< 28011 1726882531.00410: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57460c5d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 28011 1726882531.00434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57460f2de0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57460f1b50> <<< 28011 1726882531.00454: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b2ba0> <<< 28011 1726882531.00475: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 28011 1726882531.00544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882531.00561: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 28011 1726882531.00596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 28011 1726882531.00627: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574611f140> <<< 28011 1726882531.00749: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 28011 1726882531.00778: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574613f500> <<< 28011 1726882531.00797: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 28011 1726882531.00843: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 28011 1726882531.01140: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57461a02c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57461a2a20> <<< 28011 1726882531.01192: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57461a03e0> <<< 28011 1726882531.01235: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57461692e0> <<< 28011 1726882531.01258: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745fa93d0> <<< 28011 1726882531.01277: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574613e300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57460f3d10> <<< 28011 1726882531.01450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 28011 1726882531.01471: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f574613e900> <<< 28011 1726882531.01725: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ywgkzq93/ansible_ansible.legacy.setup_payload.zip' <<< 28011 1726882531.01744: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.01850: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.01882: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 28011 1726882531.01928: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 28011 1726882531.01999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 28011 1726882531.02032: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574600f0e0> <<< 28011 
1726882531.02044: stdout chunk (state=3): >>>import '_typing' # <<< 28011 1726882531.02224: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745fedfd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745fed160> # zipimport: zlib available <<< 28011 1726882531.02250: stdout chunk (state=3): >>>import 'ansible' # <<< 28011 1726882531.02307: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28011 1726882531.02340: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 28011 1726882531.03925: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.04813: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574600cfb0> <<< 28011 1726882531.04845: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882531.04867: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 28011 1726882531.04897: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 28011 1726882531.04976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # 
extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574603e960> <<< 28011 1726882531.05004: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574603e6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574603e030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 28011 1726882531.05016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 28011 1726882531.05056: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574603e750> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574600fd70> import 'atexit' # <<< 28011 1726882531.05115: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574603f680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574603f8c0> <<< 28011 1726882531.05136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 28011 1726882531.05171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 28011 1726882531.05188: stdout chunk (state=3): >>>import '_locale' # 
<<< 28011 1726882531.05245: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574603fe00> <<< 28011 1726882531.05248: stdout chunk (state=3): >>>import 'pwd' # <<< 28011 1726882531.05411: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 28011 1726882531.05414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 28011 1726882531.05417: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745929bb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574592b7d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 28011 1726882531.05419: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574592c1d0> <<< 28011 1726882531.05454: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 28011 1726882531.05457: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 28011 1726882531.05482: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574592d370> <<< 28011 1726882531.05500: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 28011 1726882531.05520: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 28011 1726882531.05591: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 28011 1726882531.05596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 28011 1726882531.05605: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574592fe60> <<< 28011 1726882531.05634: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57463cab70> <<< 28011 1726882531.05668: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574592e120> <<< 28011 1726882531.05671: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 28011 1726882531.05800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 28011 1726882531.05803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 28011 1726882531.05844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 28011 1726882531.05869: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745937c80> import '_tokenize' # <<< 28011 1726882531.05941: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745936750> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57459364b0><<< 28011 1726882531.06028: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 28011 1726882531.06047: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745936a20> <<< 28011 1726882531.06063: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574592e5a0> <<< 28011 1726882531.06135: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882531.06156: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574597bec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574597bf20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 28011 1726882531.06182: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 28011 1726882531.06263: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574597da00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574597d7c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 28011 1726882531.06266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 28011 1726882531.06312: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574597ffb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574597e0f0> <<< 28011 1726882531.06333: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 28011 1726882531.06454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882531.06473: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f57459837a0> <<< 28011 1726882531.06565: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745980170> <<< 28011 1726882531.06621: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882531.06637: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5745984500> <<< 28011 1726882531.06660: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57459847a0> <<< 28011 1726882531.06695: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5745984ad0> <<< 28011 1726882531.06821: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574597c080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 28011 1726882531.06824: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5745810170> <<< 28011 1726882531.06957: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882531.06978: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57458115b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745986900> <<< 28011 1726882531.07026: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5745987cb0> <<< 28011 1726882531.07061: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745986570> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 28011 1726882531.07171: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.07235: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28011 1726882531.07282: stdout chunk (state=3): >>>import 
'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 28011 1726882531.07305: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.07443: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.07514: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.08440: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.08561: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 28011 1726882531.08576: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 28011 1726882531.08607: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 28011 1726882531.08621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882531.08661: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5745815700> <<< 28011 1726882531.08742: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 28011 1726882531.08770: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458164b0> <<< 28011 1726882531.08785: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5745986ab0> <<< 28011 1726882531.08819: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 28011 1726882531.08855: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.08965: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 28011 1726882531.09014: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.09172: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 28011 1726882531.09199: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745816540> # zipimport: zlib available <<< 28011 1726882531.09635: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.10072: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.10139: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.10281: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 28011 1726882531.10307: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 28011 1726882531.10370: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.10448: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 28011 1726882531.10473: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.10619: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 28011 1726882531.10622: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 28011 1726882531.10797: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 28011 1726882531.11098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 28011 1726882531.11116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 28011 1726882531.11152: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745817710> # zipimport: zlib available <<< 28011 1726882531.11421: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 28011 1726882531.11469: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.11513: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.11564: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.11637: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 28011 1726882531.11665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882531.11747: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5745822090> <<< 28011 1726882531.11773: stdout chunk 
(state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574581d070> <<< 28011 1726882531.11807: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 28011 1726882531.11874: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.11934: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.11976: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.12084: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 28011 1726882531.12116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 28011 1726882531.12149: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 28011 1726882531.12205: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574590aab0> <<< 28011 1726882531.12319: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57459fe780> <<< 28011 1726882531.12337: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458221e0> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5745816ff0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 28011 1726882531.12378: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.12381: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 28011 1726882531.12394: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 28011 1726882531.12541: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 28011 1726882531.12546: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.13076: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # <<< 28011 1726882531.13079: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 28011 1726882531.13174: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.13346: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.13387: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.13446: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882531.13474: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 28011 
1726882531.13481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 28011 1726882531.13505: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 28011 1726882531.13532: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 28011 1726882531.13607: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458b6570> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 28011 1726882531.13642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 28011 1726882531.13663: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 28011 1726882531.13707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 28011 1726882531.13722: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57454fc290> <<< 28011 1726882531.13771: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57454fc5f0> <<< 28011 1726882531.13786: stdout chunk 
(state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458a7410> <<< 28011 1726882531.13795: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458b70e0> <<< 28011 1726882531.13829: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458b4cb0> <<< 28011 1726882531.13848: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458b48f0> <<< 28011 1726882531.13915: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 28011 1726882531.13988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57454ff530> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57454fede0> <<< 28011 1726882531.14049: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f57454fef90> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57454fe240> <<< 28011 1726882531.14056: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 28011 1726882531.14182: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 28011 1726882531.14186: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57454ff6b0> <<< 28011 1726882531.14211: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 28011 1726882531.14215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 28011 1726882531.14316: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574555e120> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574555c1a0> <<< 28011 1726882531.14327: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458b60c0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 28011 1726882531.14368: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.14373: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # 
zipimport: zlib available <<< 28011 1726882531.14535: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 28011 1726882531.14565: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.14615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 28011 1726882531.14692: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.14757: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 28011 1726882531.14781: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.14832: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 28011 1726882531.14838: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.14889: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.14974: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 28011 1726882531.15007: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.15066: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.15139: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.15499: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 28011 1726882531.15672: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.16229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 28011 1726882531.16246: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 28011 1726882531.16283: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 28011 1726882531.16288: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.16324: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.16354: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 28011 1726882531.16360: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.16427: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.16480: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 28011 1726882531.16503: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.16707: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 28011 1726882531.16741: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.16807: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 28011 1726882531.16847: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574555f500> <<< 28011 1726882531.16999: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 28011 1726882531.17025: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574555eb40> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 
28011 1726882531.17074: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.17137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 28011 1726882531.17143: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.17316: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.17331: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 28011 1726882531.17338: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.17404: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.17474: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 28011 1726882531.17480: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.17527: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.17575: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 28011 1726882531.17807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574559a2a0> <<< 28011 1726882531.17927: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574558aea0> <<< 28011 1726882531.17930: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 28011 1726882531.17936: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.18016: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.18056: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.selinux' # <<< 28011 1726882531.18063: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.18148: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.18231: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.18347: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.18489: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 28011 1726882531.18558: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.18561: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.18586: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 28011 1726882531.18595: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.18643: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.18704: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 28011 1726882531.18707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 28011 1726882531.18907: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57455add90> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57455af740> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 28011 1726882531.19077: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.19192: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 28011 1726882531.19205: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.19303: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.19446: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28011 1726882531.19514: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 28011 1726882531.19538: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.19560: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.19695: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.19844: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 28011 1726882531.19847: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.19971: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.20103: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 28011 1726882531.20106: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.20134: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.20213: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.20717: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.21229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 28011 1726882531.21259: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 28011 
1726882531.21532: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 28011 1726882531.21730: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 28011 1726882531.21817: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.21972: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 28011 1726882531.21998: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.22020: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 28011 1726882531.22096: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28011 1726882531.22188: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 28011 1726882531.22223: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.22341: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.22520: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.22722: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 28011 1726882531.22744: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.22775: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.22814: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 28011 1726882531.22853: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.22856: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.22881: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 28011 1726882531.22961: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.23028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 28011 1726882531.23043: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.23057: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.23092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 28011 1726882531.23109: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.23158: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.23219: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 28011 1726882531.23232: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.23289: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.23351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 28011 1726882531.23354: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.23625: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.23882: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 28011 1726882531.23939: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.24009: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 28011 1726882531.24012: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.24047: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.24102: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 28011 1726882531.24114: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.24138: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.24165: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.netbsd' # <<< 28011 1726882531.24168: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.24224: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.24250: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 28011 1726882531.24362: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.24648: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 28011 1726882531.24725: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 28011 1726882531.24758: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.24840: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 28011 1726882531.24862: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 28011 1726882531.24972: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.24996: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 28011 1726882531.25162: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.25361: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 28011 1726882531.25421: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.25456: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 28011 1726882531.25573: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 28011 1726882531.25582: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.25602: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 28011 1726882531.25654: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.25753: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 28011 1726882531.25853: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.25975: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 28011 1726882531.26003: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882531.26687: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 28011 1726882531.26743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 28011 1726882531.26749: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 28011 1726882531.26754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 28011 1726882531.26841: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882531.26845: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57453aee10> <<< 28011 1726882531.26858: 
stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57453af020> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57453ac7a0> <<< 28011 1726882531.40734: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 28011 1726882531.40749: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 28011 1726882531.40808: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57453f5430> <<< 28011 1726882531.40812: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 28011 1726882531.40837: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57453f6270> <<< 28011 1726882531.40928: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 28011 1726882531.40976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574559c9b0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f574559c470> <<< 28011 1726882531.41243: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 28011 1726882531.61363: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_loadavg": {"1m": 0.482421875, "5m": 0.38818359375, "15m": 0.21142578125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F<<< 28011 1726882531.61437: stdout chunk (state=3): >>>2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": 
"CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "31", "epoch": "1726882531", "epoch_int": "1726882531", "date": "2024-09-20", "time": "21:35:31", "iso8601_micro": "2024-09-21T01:35:31.273663Z", "iso8601": "2024-09-21T01:35:31Z", "iso8601_basic": "20240920T213531273663", "iso8601_basic_short": "20240920T213531", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": 
"127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off 
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tl<<< 28011 1726882531.61447: stdout chunk (state=3): >>>s_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 964, 
"ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794037760, "block_size": 4096, "block_total": 65519099, "block_available": 63914560, "block_used": 1604539, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28011 1726882531.61992: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 28011 1726882531.62003: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc<<< 28011 1726882531.62019: stdout chunk (state=3): >>> # clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 28011 1726882531.62031: stdout chunk (state=3): >>># clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] 
removing io # cleanup[2] removing __main__ <<< 28011 1726882531.62042: stdout chunk (state=3): >>># cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site <<< 28011 1726882531.62066: stdout chunk (state=3): >>># destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression <<< 28011 1726882531.62081: stdout chunk (state=3): >>># cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy 
bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile <<< 28011 1726882531.62097: stdout chunk (state=3): >>># cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess <<< 28011 1726882531.62123: stdout chunk (state=3): >>># cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing 
_datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves <<< 28011 1726882531.62129: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool<<< 28011 
1726882531.62162: stdout chunk (state=3): >>> # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext <<< 28011 1726882531.62185: stdout chunk (state=3): >>># destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime <<< 28011 1726882531.62195: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing 
ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl <<< 28011 1726882531.62214: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing 
ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl <<< 28011 1726882531.62240: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy 
ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb <<< 28011 1726882531.62251: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd <<< 28011 1726882531.62262: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 28011 1726882531.62574: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 28011 1726882531.62585: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 28011 1726882531.62616: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 28011 1726882531.62624: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 28011 1726882531.62647: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 28011 1726882531.62677: stdout chunk (state=3): >>># destroy ntpath <<< 28011 1726882531.62695: stdout chunk (state=3): >>># destroy importlib <<< 28011 1726882531.62713: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 28011 
1726882531.62733: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 28011 1726882531.62749: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select <<< 28011 1726882531.62762: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess <<< 28011 1726882531.62770: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 28011 1726882531.62809: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 28011 1726882531.62818: stdout chunk (state=3): >>># destroy distro # destroy distro.distro <<< 28011 1726882531.62825: stdout chunk (state=3): >>># destroy argparse # destroy logging <<< 28011 1726882531.62860: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 28011 1726882531.62870: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 28011 1726882531.62897: stdout chunk (state=3): >>># destroy _pickle <<< 28011 1726882531.62904: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction <<< 28011 1726882531.62917: stdout chunk (state=3): >>># destroy selectors <<< 28011 1726882531.62938: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 28011 1726882531.62941: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 28011 1726882531.62971: stdout chunk (state=3): >>># destroy _ssl <<< 28011 1726882531.62986: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 28011 1726882531.62996: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy json <<< 28011 1726882531.63023: stdout chunk 
(state=3): >>># destroy socket # destroy struct # destroy glob <<< 28011 1726882531.63040: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection <<< 28011 1726882531.63046: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 28011 1726882531.63101: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 28011 1726882531.63108: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser <<< 28011 1726882531.63127: stdout chunk (state=3): >>># cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 28011 1726882531.63142: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 28011 1726882531.63160: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 28011 
1726882531.63166: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 28011 1726882531.63209: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 28011 1726882531.63215: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 28011 1726882531.63219: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath <<< 28011 1726882531.63236: stdout chunk (state=3): >>># cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 28011 1726882531.63249: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 28011 1726882531.63391: stdout chunk (state=3): >>># destroy sys.monitoring <<< 28011 1726882531.63397: stdout chunk (state=3): >>># destroy _socket <<< 28011 1726882531.63414: stdout chunk 
(state=3): >>># destroy _collections <<< 28011 1726882531.63438: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 28011 1726882531.63445: stdout chunk (state=3): >>># destroy tokenize <<< 28011 1726882531.63472: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 28011 1726882531.63517: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 28011 1726882531.63520: stdout chunk (state=3): >>># destroy _typing <<< 28011 1726882531.63554: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 28011 1726882531.63587: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 28011 1726882531.63702: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 28011 1726882531.63705: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time <<< 28011 1726882531.63708: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 28011 1726882531.63710: stdout chunk (state=3): >>># destroy _hashlib <<< 28011 1726882531.63744: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re <<< 28011 1726882531.63748: stdout chunk (state=3): >>># 
destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 28011 1726882531.64157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882531.64160: stdout chunk (state=3): >>><<< 28011 1726882531.64163: stderr chunk (state=3): >>><<< 28011 1726882531.64388: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57465104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57464dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5746512a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: 
'/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57462e5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57462e5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5746323dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5746323fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574635b7a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f574635be30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574633baa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463391c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5746320f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574637b710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574637a330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574633a090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5746378b90> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b0740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5746320200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57463b0bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b0aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57463b0e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574631ed20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b1580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b1250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b2480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463c8680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57463c9d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463cac00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57463cb260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463ca150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f57463cbce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463cb410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57460c7bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57460f06e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57460f0440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57460f0710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57460f1040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57460f1a30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57460f08f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57460c5d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57460f2de0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57460f1b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57463b2ba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f574611f140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574613f500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57461a02c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57461a2a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57461a03e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57461692e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5745fa93d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574613e300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57460f3d10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f574613e900> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ywgkzq93/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574600f0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745fedfd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745fed160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574600cfb0> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574603e960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574603e6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574603e030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574603e750> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574600fd70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574603f680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574603f8c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574603fe00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745929bb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574592b7d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574592c1d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574592d370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f574592fe60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57463cab70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574592e120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745937c80> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745936750> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57459364b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745936a20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574592e5a0> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574597bec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574597bf20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574597da00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574597d7c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574597ffb0> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f574597e0f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57459837a0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745980170> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5745984500> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57459847a0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5745984ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574597c080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5745810170> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57458115b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745986900> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5745987cb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745986570> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5745815700> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458164b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745986ab0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745816540> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745817710> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5745822090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574581d070> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574590aab0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57459fe780> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458221e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5745816ff0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458b6570> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57454fc290> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57454fc5f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458a7410> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f57458b70e0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458b4cb0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458b48f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57454ff530> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57454fede0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57454fef90> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57454fe240> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57454ff6b0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574555e120> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574555c1a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57458b60c0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574555f500> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574555eb40> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f574559a2a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574558aea0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57455add90> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57455af740> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57453aee10> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57453af020> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57453ac7a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57453f5430> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57453f6270> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574559c9b0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f574559c470> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_loadavg": {"1m": 0.482421875, "5m": 0.38818359375, "15m": 0.21142578125}, "ansible_python": 
{"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "31", "epoch": "1726882531", "epoch_int": "1726882531", "date": "2024-09-20", "time": "21:35:31", "iso8601_micro": "2024-09-21T01:35:31.273663Z", "iso8601": "2024-09-21T01:35:31Z", "iso8601_basic": "20240920T213531273663", "iso8601_basic_short": "20240920T213531", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, 
"ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off 
[fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) 
CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, 
"uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 964, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794037760, "block_size": 4096, "block_total": 65519099, "block_available": 63914560, "block_used": 1604539, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # 
cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # 
cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing 
_socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # 
cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata 
# destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # 
cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy 
_bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
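The `# clear …`/`# destroy …` lines above are Python interpreter shutdown tracing that the remote module process printed to stdout after its JSON result; this is the "junk after the JSON data" that Ansible warns about below, and Ansible recovers by decoding only the leading JSON object and discarding the trailing text. A minimal sketch of that kind of recovery, using the standard library's `json.JSONDecoder.raw_decode` (an illustration under that assumption, not Ansible's actual implementation):

```python
import json

def split_json_prefix(stdout: str):
    """Decode the leading JSON object from module stdout.

    Returns (data, junk): the parsed result and whatever trailing
    text followed it (e.g. interpreter cleanup messages).
    """
    decoder = json.JSONDecoder()
    start = stdout.index("{")  # skip anything before the JSON payload
    data, end = decoder.raw_decode(stdout, start)
    return data, stdout[end:].strip()

# Example: a facts payload followed by shutdown tracing.
mixed = '{"ansible_facts": {"ansible_service_mgr": "systemd"}} # clear sys.path_hooks'
data, junk = split_json_prefix(mixed)
```

Here `data` holds the parsed facts dict and `junk` holds the stray `# clear sys.path_hooks` text, which a caller could log as a warning rather than treat as a parse failure.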
[WARNING]: Module invocation had junk after the JSON data:
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
28011 1726882531.66115: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882531.66118: _low_level_execute_command(): starting 28011 1726882531.66121: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882530.492573-28055-55765411688420/ > /dev/null 2>&1 && sleep 0' 28011 1726882531.66364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882531.66379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882531.66395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882531.66422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882531.66499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882531.66520: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882531.66569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882531.66596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882531.66642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882531.66686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882531.68619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882531.68623: stdout chunk (state=3): >>><<< 28011 1726882531.68625: stderr chunk (state=3): >>><<< 28011 1726882531.68628: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882531.68630: handler run complete 28011 1726882531.68836: variable 'ansible_facts' from source: unknown 28011 1726882531.68998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882531.69785: variable 'ansible_facts' from source: unknown 28011 1726882531.69878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882531.70037: attempt loop complete, returning result 28011 1726882531.70040: _execute() done 28011 1726882531.70042: dumping result to json 28011 1726882531.70078: done dumping result, returning 28011 1726882531.70112: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-962d-7c65-0000000000f5] 28011 1726882531.70142: sending task result for task 12673a56-9f93-962d-7c65-0000000000f5 ok: [managed_node1] 28011 1726882531.71179: no more pending results, returning what we have 28011 1726882531.71182: results queue empty 28011 1726882531.71183: checking for any_errors_fatal 28011 1726882531.71184: done checking for any_errors_fatal 28011 1726882531.71185: checking for max_fail_percentage 28011 1726882531.71187: done checking for max_fail_percentage 28011 1726882531.71188: checking to see if all hosts have failed and the running result is not ok 28011 1726882531.71188: done checking to see if all hosts have failed 28011 1726882531.71189: getting the remaining hosts for this loop 28011 1726882531.71191: done getting the remaining hosts for this loop 28011 1726882531.71196: getting the next task for host managed_node1 28011 1726882531.71203: done getting next task for host managed_node1 28011 1726882531.71205: ^ task is: TASK: meta (flush_handlers) 28011 1726882531.71206: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882531.71211: getting variables 28011 1726882531.71212: in VariableManager get_vars() 28011 1726882531.71235: Calling all_inventory to load vars for managed_node1 28011 1726882531.71238: Calling groups_inventory to load vars for managed_node1 28011 1726882531.71241: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882531.71344: done sending task result for task 12673a56-9f93-962d-7c65-0000000000f5 28011 1726882531.71347: WORKER PROCESS EXITING 28011 1726882531.71384: Calling all_plugins_play to load vars for managed_node1 28011 1726882531.71388: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882531.71391: Calling groups_plugins_play to load vars for managed_node1 28011 1726882531.71633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882531.71998: done with get_vars() 28011 1726882531.72008: done getting variables 28011 1726882531.72072: in VariableManager get_vars() 28011 1726882531.72082: Calling all_inventory to load vars for managed_node1 28011 1726882531.72084: Calling groups_inventory to load vars for managed_node1 28011 1726882531.72086: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882531.72096: Calling all_plugins_play to load vars for managed_node1 28011 1726882531.72099: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882531.72102: Calling groups_plugins_play to load vars for managed_node1 28011 1726882531.72243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882531.72449: done with get_vars() 28011 1726882531.72462: done queuing things up, now waiting for results queue to drain 28011 
1726882531.72464: results queue empty 28011 1726882531.72464: checking for any_errors_fatal 28011 1726882531.72466: done checking for any_errors_fatal 28011 1726882531.72471: checking for max_fail_percentage 28011 1726882531.72472: done checking for max_fail_percentage 28011 1726882531.72473: checking to see if all hosts have failed and the running result is not ok 28011 1726882531.72474: done checking to see if all hosts have failed 28011 1726882531.72474: getting the remaining hosts for this loop 28011 1726882531.72475: done getting the remaining hosts for this loop 28011 1726882531.72477: getting the next task for host managed_node1 28011 1726882531.72482: done getting next task for host managed_node1 28011 1726882531.72484: ^ task is: TASK: Include the task 'el_repo_setup.yml' 28011 1726882531.72485: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882531.72488: getting variables 28011 1726882531.72489: in VariableManager get_vars() 28011 1726882531.72501: Calling all_inventory to load vars for managed_node1 28011 1726882531.72503: Calling groups_inventory to load vars for managed_node1 28011 1726882531.72505: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882531.72509: Calling all_plugins_play to load vars for managed_node1 28011 1726882531.72511: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882531.72514: Calling groups_plugins_play to load vars for managed_node1 28011 1726882531.72648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882531.72827: done with get_vars() 28011 1726882531.72836: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:11 Friday 20 September 2024 21:35:31 -0400 (0:00:01.272) 0:00:01.280 ****** 28011 1726882531.72917: entering _queue_task() for managed_node1/include_tasks 28011 1726882531.72920: Creating lock for include_tasks 28011 1726882531.73212: worker is 1 (out of 1 available) 28011 1726882531.73224: exiting _queue_task() for managed_node1/include_tasks 28011 1726882531.73235: done queuing things up, now waiting for results queue to drain 28011 1726882531.73237: waiting for pending results... 
28011 1726882531.73476: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 28011 1726882531.73562: in run() - task 12673a56-9f93-962d-7c65-000000000006 28011 1726882531.73576: variable 'ansible_search_path' from source: unknown 28011 1726882531.73618: calling self._execute() 28011 1726882531.73684: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882531.73689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882531.73705: variable 'omit' from source: magic vars 28011 1726882531.73812: _execute() done 28011 1726882531.73816: dumping result to json 28011 1726882531.73818: done dumping result, returning 28011 1726882531.73828: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [12673a56-9f93-962d-7c65-000000000006] 28011 1726882531.73851: sending task result for task 12673a56-9f93-962d-7c65-000000000006 28011 1726882531.74010: done sending task result for task 12673a56-9f93-962d-7c65-000000000006 28011 1726882531.74014: WORKER PROCESS EXITING 28011 1726882531.74052: no more pending results, returning what we have 28011 1726882531.74056: in VariableManager get_vars() 28011 1726882531.74082: Calling all_inventory to load vars for managed_node1 28011 1726882531.74085: Calling groups_inventory to load vars for managed_node1 28011 1726882531.74088: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882531.74101: Calling all_plugins_play to load vars for managed_node1 28011 1726882531.74103: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882531.74106: Calling groups_plugins_play to load vars for managed_node1 28011 1726882531.74343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882531.74524: done with get_vars() 28011 1726882531.74531: variable 'ansible_search_path' from source: unknown 28011 1726882531.74544: we have 
included files to process 28011 1726882531.74545: generating all_blocks data 28011 1726882531.74546: done generating all_blocks data 28011 1726882531.74547: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28011 1726882531.74548: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28011 1726882531.74550: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28011 1726882531.75183: in VariableManager get_vars() 28011 1726882531.75203: done with get_vars() 28011 1726882531.75215: done processing included file 28011 1726882531.75218: iterating over new_blocks loaded from include file 28011 1726882531.75219: in VariableManager get_vars() 28011 1726882531.75229: done with get_vars() 28011 1726882531.75230: filtering new block on tags 28011 1726882531.75244: done filtering new block on tags 28011 1726882531.75247: in VariableManager get_vars() 28011 1726882531.75257: done with get_vars() 28011 1726882531.75258: filtering new block on tags 28011 1726882531.75272: done filtering new block on tags 28011 1726882531.75275: in VariableManager get_vars() 28011 1726882531.75285: done with get_vars() 28011 1726882531.75287: filtering new block on tags 28011 1726882531.75303: done filtering new block on tags 28011 1726882531.75305: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 28011 1726882531.75310: extending task lists for all hosts with included blocks 28011 1726882531.75355: done extending task lists 28011 1726882531.75356: done processing included files 28011 1726882531.75357: results queue empty 28011 1726882531.75357: checking for any_errors_fatal 28011 1726882531.75358: done checking for any_errors_fatal 28011 
1726882531.75359: checking for max_fail_percentage 28011 1726882531.75360: done checking for max_fail_percentage 28011 1726882531.75361: checking to see if all hosts have failed and the running result is not ok 28011 1726882531.75361: done checking to see if all hosts have failed 28011 1726882531.75362: getting the remaining hosts for this loop 28011 1726882531.75363: done getting the remaining hosts for this loop 28011 1726882531.75365: getting the next task for host managed_node1 28011 1726882531.75369: done getting next task for host managed_node1 28011 1726882531.75371: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 28011 1726882531.75373: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882531.75374: getting variables 28011 1726882531.75375: in VariableManager get_vars() 28011 1726882531.75383: Calling all_inventory to load vars for managed_node1 28011 1726882531.75385: Calling groups_inventory to load vars for managed_node1 28011 1726882531.75387: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882531.75396: Calling all_plugins_play to load vars for managed_node1 28011 1726882531.75399: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882531.75402: Calling groups_plugins_play to load vars for managed_node1 28011 1726882531.75553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882531.75735: done with get_vars() 28011 1726882531.75743: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:35:31 -0400 (0:00:00.028) 0:00:01.309 ****** 28011 1726882531.75806: entering _queue_task() for managed_node1/setup 28011 1726882531.76021: worker is 1 (out of 1 available) 28011 1726882531.76031: exiting _queue_task() for managed_node1/setup 28011 1726882531.76042: done queuing things up, now waiting for results queue to drain 28011 1726882531.76044: waiting for pending results... 
28011 1726882531.76373: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 28011 1726882531.76377: in run() - task 12673a56-9f93-962d-7c65-000000000106 28011 1726882531.76380: variable 'ansible_search_path' from source: unknown 28011 1726882531.76382: variable 'ansible_search_path' from source: unknown 28011 1726882531.76401: calling self._execute() 28011 1726882531.76471: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882531.76474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882531.76482: variable 'omit' from source: magic vars 28011 1726882531.76985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882531.79078: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882531.79145: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882531.79181: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882531.79219: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882531.79249: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882531.79326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882531.79358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882531.79488: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882531.79492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882531.79496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882531.79612: variable 'ansible_facts' from source: unknown 28011 1726882531.79672: variable 'network_test_required_facts' from source: task vars 28011 1726882531.79715: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 28011 1726882531.79726: variable 'omit' from source: magic vars 28011 1726882531.79765: variable 'omit' from source: magic vars 28011 1726882531.79809: variable 'omit' from source: magic vars 28011 1726882531.79841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882531.79871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882531.79901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882531.79928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882531.79945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882531.79978: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882531.79987: variable 'ansible_host' from source: host vars for 
'managed_node1' 28011 1726882531.80000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882531.80103: Set connection var ansible_connection to ssh 28011 1726882531.80142: Set connection var ansible_pipelining to False 28011 1726882531.80146: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882531.80148: Set connection var ansible_shell_executable to /bin/sh 28011 1726882531.80151: Set connection var ansible_timeout to 10 28011 1726882531.80155: Set connection var ansible_shell_type to sh 28011 1726882531.80186: variable 'ansible_shell_executable' from source: unknown 28011 1726882531.80251: variable 'ansible_connection' from source: unknown 28011 1726882531.80254: variable 'ansible_module_compression' from source: unknown 28011 1726882531.80256: variable 'ansible_shell_type' from source: unknown 28011 1726882531.80258: variable 'ansible_shell_executable' from source: unknown 28011 1726882531.80260: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882531.80262: variable 'ansible_pipelining' from source: unknown 28011 1726882531.80264: variable 'ansible_timeout' from source: unknown 28011 1726882531.80266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882531.80383: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882531.80404: variable 'omit' from source: magic vars 28011 1726882531.80414: starting attempt loop 28011 1726882531.80422: running the handler 28011 1726882531.80441: _low_level_execute_command(): starting 28011 1726882531.80453: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882531.81277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882531.81296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882531.81313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882531.81397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882531.82965: stdout chunk (state=3): >>>/root <<< 28011 1726882531.83125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882531.83128: stdout chunk (state=3): >>><<< 28011 1726882531.83131: stderr chunk (state=3): >>><<< 28011 1726882531.83155: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882531.83178: _low_level_execute_command(): starting 28011 1726882531.83192: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535 `" && echo ansible-tmp-1726882531.8316693-28103-226582408038535="` echo /root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535 `" ) && sleep 0' 28011 1726882531.83845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882531.83853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882531.83899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882531.85767: stdout chunk (state=3): >>>ansible-tmp-1726882531.8316693-28103-226582408038535=/root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535 <<< 28011 1726882531.85885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882531.85930: stderr chunk (state=3): >>><<< 28011 1726882531.85948: stdout chunk (state=3): >>><<< 28011 1726882531.86100: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882531.8316693-28103-226582408038535=/root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882531.86104: variable 'ansible_module_compression' from source: unknown 28011 1726882531.86107: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28011 1726882531.86140: variable 'ansible_facts' from source: unknown 28011 1726882531.86353: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535/AnsiballZ_setup.py 28011 1726882531.86571: Sending initial data 28011 1726882531.86574: Sent initial data (154 bytes) 28011 1726882531.87152: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882531.87176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882531.87203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882531.87231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882531.87278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882531.87370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882531.87404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882531.87433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882531.87519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882531.89046: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28011 1726882531.89057: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882531.89087: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882531.89130: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp8s52unvq /root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535/AnsiballZ_setup.py <<< 28011 1726882531.89133: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535/AnsiballZ_setup.py" <<< 28011 1726882531.89176: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp8s52unvq" to remote "/root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535/AnsiballZ_setup.py" <<< 28011 1726882531.89178: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535/AnsiballZ_setup.py" <<< 28011 1726882531.90285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882531.90419: stderr chunk (state=3): >>><<< 28011 1726882531.90423: stdout chunk (state=3): >>><<< 28011 1726882531.90425: done transferring module to remote 28011 1726882531.90427: _low_level_execute_command(): starting 28011 1726882531.90429: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535/ /root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535/AnsiballZ_setup.py && sleep 0' 28011 1726882531.90944: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882531.90957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882531.90968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882531.90977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882531.91024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882531.91046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882531.91086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882531.92811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882531.92823: stderr chunk (state=3): >>><<< 28011 1726882531.92826: stdout chunk (state=3): >>><<< 28011 1726882531.92838: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882531.92841: _low_level_execute_command(): starting 28011 1726882531.92843: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535/AnsiballZ_setup.py && sleep 0' 28011 1726882531.93372: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882531.93387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882531.93410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882531.93459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882531.93511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882531.93543: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882531.93581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882531.93626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882531.95719: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 28011 1726882531.95756: stdout chunk (state=3): >>>import _imp # builtin <<< 28011 1726882531.95784: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 28011 1726882531.95792: stdout chunk (state=3): >>>import '_weakref' # <<< 28011 1726882531.95861: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 28011 1726882531.95906: stdout chunk (state=3): >>>import 'posix' # <<< 28011 1726882531.95937: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 28011 1726882531.95964: stdout chunk (state=3): >>>import 'time' # <<< 28011 1726882531.95984: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 28011 1726882531.96042: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 28011 1726882531.96057: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 28011 1726882531.96077: stdout chunk (state=3): >>>import 'codecs' # <<< 28011 1726882531.96124: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 28011 1726882531.96167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea1684d0> <<< 28011 1726882531.96170: stdout chunk (state=3): 
>>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea137b30> <<< 28011 1726882531.96192: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea16aa50> <<< 28011 1726882531.96213: stdout chunk (state=3): >>>import '_signal' # <<< 28011 1726882531.96255: stdout chunk (state=3): >>>import '_abc' # <<< 28011 1726882531.96258: stdout chunk (state=3): >>>import 'abc' # <<< 28011 1726882531.96300: stdout chunk (state=3): >>>import 'io' # <<< 28011 1726882531.96303: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 28011 1726882531.96384: stdout chunk (state=3): >>>import '_collections_abc' # <<< 28011 1726882531.96420: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 28011 1726882531.96443: stdout chunk (state=3): >>>import 'os' # <<< 28011 1726882531.96479: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 28011 1726882531.96488: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 28011 1726882531.96511: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 28011 1726882531.96544: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 28011 1726882531.96596: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f3d130> <<< 28011 1726882531.96648: stdout chunk (state=3): 
>>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f3dfa0> <<< 28011 1726882531.96682: stdout chunk (state=3): >>>import 'site' # <<< 28011 1726882531.96705: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 28011 1726882531.97110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 28011 1726882531.97125: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882531.97204: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 28011 1726882531.97225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 28011 1726882531.97236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 28011 1726882531.97256: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f7be90> <<< 28011 1726882531.97274: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 28011 1726882531.97311: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 28011 1726882531.97314: stdout chunk (state=3): >>>import '_operator' # <<< 28011 1726882531.97321: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f7bf50> <<< 28011 1726882531.97331: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 28011 1726882531.97359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 28011 1726882531.97381: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 28011 1726882531.97436: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882531.97448: stdout chunk (state=3): >>>import 'itertools' # <<< 28011 1726882531.97479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 28011 1726882531.97487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9fb3830> <<< 28011 1726882531.97506: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 28011 1726882531.97524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9fb3ec0> <<< 28011 1726882531.97537: stdout chunk (state=3): >>>import '_collections' # <<< 28011 1726882531.97585: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f08e9f93b60> <<< 28011 1726882531.97596: stdout chunk (state=3): >>>import '_functools' # <<< 28011 1726882531.97628: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f91280> <<< 28011 1726882531.97716: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f79040> <<< 28011 1726882531.97748: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 28011 1726882531.97760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 28011 1726882531.97777: stdout chunk (state=3): >>>import '_sre' # <<< 28011 1726882531.97799: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 28011 1726882531.97829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 28011 1726882531.97846: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 28011 1726882531.97856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 28011 1726882531.97885: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9fd37d0> <<< 28011 1726882531.97901: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9fd23f0> <<< 28011 1726882531.97927: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 28011 1726882531.97936: stdout chunk (state=3): >>>import 're._casefix' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f92150> <<< 28011 1726882531.97949: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9fd0c20> <<< 28011 1726882531.97998: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 28011 1726882531.98005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea008860> <<< 28011 1726882531.98029: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f782c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 28011 1726882531.98051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 28011 1726882531.98071: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882531.98074: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08ea008d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea008bc0> <<< 28011 1726882531.98118: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08ea008f80> <<< 28011 1726882531.98127: stdout chunk (state=3): >>>import 'base64' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f76de0> <<< 28011 1726882531.98163: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 28011 1726882531.98170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882531.98184: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 28011 1726882531.98223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 28011 1726882531.98230: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea009610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea0092e0> <<< 28011 1726882531.98242: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 28011 1726882531.98274: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 28011 1726882531.98299: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea00a510> <<< 28011 1726882531.98315: stdout chunk (state=3): >>>import 'importlib.util' # <<< 28011 1726882531.98320: stdout chunk (state=3): >>>import 'runpy' # <<< 28011 1726882531.98342: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 28011 1726882531.98378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 28011 1726882531.98405: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches 
/usr/lib64/python3.12/fnmatch.py <<< 28011 1726882531.98408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea020710> <<< 28011 1726882531.98427: stdout chunk (state=3): >>>import 'errno' # <<< 28011 1726882531.98459: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882531.98467: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08ea021df0> <<< 28011 1726882531.98491: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 28011 1726882531.98502: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 28011 1726882531.98534: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 28011 1726882531.98548: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea022c90> <<< 28011 1726882531.98588: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882531.98598: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08ea0232f0> <<< 28011 1726882531.98610: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea0221e0> <<< 
28011 1726882531.98620: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 28011 1726882531.98635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 28011 1726882531.98666: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882531.98682: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08ea023d70> <<< 28011 1726882531.98687: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea0234a0> <<< 28011 1726882531.98732: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea00a540> <<< 28011 1726882531.98752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 28011 1726882531.98775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 28011 1726882531.98798: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 28011 1726882531.98821: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 28011 1726882531.98853: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9d17bf0> <<< 28011 1726882531.98875: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 28011 1726882531.98913: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882531.98918: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9d406e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d40440> <<< 28011 1726882531.98947: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882531.98950: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9d40710> <<< 28011 1726882531.98981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 28011 1726882531.99082: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882531.99198: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9d41040> <<< 28011 1726882531.99314: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # 
extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9d419a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d408f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d15d90> <<< 28011 1726882531.99350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 28011 1726882531.99362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 28011 1726882531.99408: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 28011 1726882531.99422: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d42db0> <<< 28011 1726882531.99451: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d41af0> <<< 28011 1726882531.99465: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea00ac30> <<< 28011 1726882531.99486: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 28011 1726882531.99547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882531.99557: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 28011 1726882531.99636: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 28011 
1726882531.99659: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d6f110> <<< 28011 1726882531.99710: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882531.99736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 28011 1726882531.99739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 28011 1726882531.99774: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d8f4a0> <<< 28011 1726882531.99802: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 28011 1726882531.99841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 28011 1726882531.99892: stdout chunk (state=3): >>>import 'ntpath' # <<< 28011 1726882531.99914: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 28011 1726882531.99919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9df0260> <<< 28011 1726882531.99938: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 28011 1726882531.99970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 28011 1726882531.99995: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 28011 1726882532.00036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 28011 1726882532.00117: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9df29c0> <<< 28011 1726882532.00199: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9df0380> <<< 28011 1726882532.00228: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9db9280> <<< 28011 1726882532.00258: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9725340> <<< 28011 1726882532.00281: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d8e2a0> <<< 28011 1726882532.00287: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d43ce0> <<< 28011 1726882532.00457: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 28011 1726882532.00480: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f08e97255b0> <<< 28011 1726882532.00750: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_bguwpstq/ansible_setup_payload.zip' <<< 28011 1726882532.00755: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.00871: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.00905: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 28011 1726882532.00911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 28011 1726882532.00956: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 28011 1726882532.01028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 28011 1726882532.01064: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e978f0b0> <<< 28011 1726882532.01069: stdout chunk (state=3): >>>import '_typing' # <<< 28011 1726882532.01257: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e976dfa0> <<< 28011 1726882532.01260: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e976d130> # zipimport: zlib available <<< 28011 1726882532.01297: stdout chunk (state=3): >>>import 'ansible' # <<< 28011 1726882532.01302: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.01326: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.01335: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.01351: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 28011 1726882532.01363: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.02748: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.03848: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py <<< 28011 1726882532.03853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e978cf80> <<< 28011 1726882532.03876: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 28011 1726882532.03880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.03912: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 28011 1726882532.03938: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 28011 1726882532.03943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 28011 1726882532.03969: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e97be9c0> <<< 28011 1726882532.04008: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e97be750> <<< 28011 1726882532.04042: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e97be060> <<< 28011 1726882532.04063: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 28011 1726882532.04071: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 28011 1726882532.04113: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e97be7e0> <<< 28011 1726882532.04119: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e978fd40> import 'atexit' # <<< 28011 1726882532.04150: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e97bf710> <<< 28011 1726882532.04180: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.04185: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e97bf950> <<< 28011 1726882532.04207: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 28011 1726882532.04254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 28011 1726882532.04258: stdout chunk (state=3): >>>import '_locale' # <<< 28011 1726882532.04308: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e97bfe90> <<< 28011 1726882532.04313: stdout chunk (state=3): >>>import 'pwd' # <<< 28011 1726882532.04338: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 28011 1726882532.04362: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 28011 1726882532.04401: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9629ca0> <<< 28011 1726882532.04427: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.04435: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e962b8c0> <<< 28011 1726882532.04448: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 28011 1726882532.04471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 28011 1726882532.04506: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e962c2c0> <<< 28011 1726882532.04524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 28011 1726882532.04554: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 28011 1726882532.04571: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e962d460> <<< 28011 1726882532.04594: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 28011 1726882532.04625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 28011 1726882532.04649: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 28011 
1726882532.04658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 28011 1726882532.04708: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e962ff20> <<< 28011 1726882532.04742: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9d402f0> <<< 28011 1726882532.04770: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e962e1e0> <<< 28011 1726882532.04783: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 28011 1726882532.04815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 28011 1726882532.04837: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 28011 1726882532.04842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 28011 1726882532.04856: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 28011 1726882532.04950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 28011 1726882532.04972: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 28011 1726882532.04983: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 28011 1726882532.04988: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9637e00> <<< 28011 1726882532.05009: stdout chunk (state=3): >>>import '_tokenize' # <<< 28011 1726882532.05069: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e96368d0> <<< 28011 1726882532.05075: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9636630> <<< 28011 1726882532.05090: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 28011 1726882532.05107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 28011 1726882532.05175: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9636ba0> <<< 28011 1726882532.05208: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e962e6f0> <<< 28011 1726882532.05234: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.05241: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e967bf20> <<< 28011 1726882532.05268: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e967c200> <<< 28011 
1726882532.05299: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 28011 1726882532.05307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 28011 1726882532.05334: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 28011 1726882532.05376: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e967dca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e967da60> <<< 28011 1726882532.05400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 28011 1726882532.05429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 28011 1726882532.05476: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9680230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e967e360> <<< 28011 1726882532.05506: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 28011 
1726882532.05540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.05568: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 28011 1726882532.05575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 28011 1726882532.05587: stdout chunk (state=3): >>>import '_string' # <<< 28011 1726882532.05626: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e96839e0> <<< 28011 1726882532.05748: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e96803e0> <<< 28011 1726882532.05811: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.05816: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9684ce0> <<< 28011 1726882532.05842: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.05847: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9684a10> <<< 28011 1726882532.05884: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e96843b0> <<< 28011 1726882532.05907: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e967c3b0> <<< 28011 1726882532.05925: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 28011 1726882532.05933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 28011 1726882532.05943: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 28011 1726882532.05969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 28011 1726882532.05997: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.06023: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9510410> <<< 28011 1726882532.06163: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.06171: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9511370> <<< 28011 1726882532.06185: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9686bd0> <<< 28011 
1726882532.06220: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.06224: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9687f20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e96867e0> <<< 28011 1726882532.06238: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.06254: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.06262: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 28011 1726882532.06273: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.06361: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.06453: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.06467: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 28011 1726882532.06478: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.06501: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.06512: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 28011 1726882532.06519: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.06636: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.06757: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.07278: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.07819: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 28011 1726882532.07826: stdout chunk (state=3): >>>import 
'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 28011 1726882532.07858: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 28011 1726882532.07868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.07921: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9515790> <<< 28011 1726882532.08007: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 28011 1726882532.08021: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9516c00> <<< 28011 1726882532.08033: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95115e0> <<< 28011 1726882532.08079: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 28011 1726882532.08085: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.08117: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.08127: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 28011 1726882532.08141: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.08284: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.08441: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 28011 1726882532.08461: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9516c90> <<< 28011 1726882532.08469: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.08922: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.09361: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.09437: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.09514: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 28011 1726882532.09521: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.09559: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.09598: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 28011 1726882532.09603: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.09675: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.09755: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 28011 1726882532.09782: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28011 1726882532.09785: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 28011 1726882532.09805: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.09847: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.09883: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 28011 1726882532.09898: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.10121: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.10353: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches 
/usr/lib64/python3.12/ast.py <<< 28011 1726882532.10410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 28011 1726882532.10429: stdout chunk (state=3): >>>import '_ast' # <<< 28011 1726882532.10488: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9517830> <<< 28011 1726882532.10497: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.10568: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.10648: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 28011 1726882532.10652: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 28011 1726882532.10664: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 28011 1726882532.10684: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.10726: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.10766: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 28011 1726882532.10770: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.10823: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.10865: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.10923: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.10987: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 28011 1726882532.11027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.11104: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.11109: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9522150> <<< 28011 1726882532.11140: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e951da90> <<< 28011 1726882532.11172: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 28011 1726882532.11181: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.11249: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.11309: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.11341: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.11386: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 28011 1726882532.11395: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.11407: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 28011 1726882532.11431: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 28011 1726882532.11450: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 28011 1726882532.11511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 
28011 1726882532.11520: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 28011 1726882532.11541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 28011 1726882532.11599: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e960ab40> <<< 28011 1726882532.11640: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e97ea810> <<< 28011 1726882532.11720: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95222d0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95170e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 28011 1726882532.11729: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.11765: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.11791: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 28011 1726882532.11796: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 28011 1726882532.11845: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 28011 1726882532.11859: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.11877: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.11883: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 28011 1726882532.11897: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.11956: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12017: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12039: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12056: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 28011 1726882532.12106: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12145: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12183: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12217: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 28011 1726882532.12232: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12302: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12371: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12394: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12432: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 28011 1726882532.12439: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12614: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12786: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12827: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.12884: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 28011 1726882532.12892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.12907: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 28011 1726882532.12929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 28011 1726882532.12939: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/process.py <<< 28011 1726882532.12967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 28011 1726882532.12983: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95b22a0> <<< 28011 1726882532.13016: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 28011 1726882532.13019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 28011 1726882532.13050: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 28011 1726882532.13084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 28011 1726882532.13115: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 28011 1726882532.13121: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 28011 1726882532.13137: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9158200> <<< 28011 1726882532.13161: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.13181: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e91587d0> <<< 28011 1726882532.13236: stdout chunk (state=3): >>>import 'pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f08e95981d0> <<< 28011 1726882532.13248: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95b2e40> <<< 28011 1726882532.13281: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95b0980> <<< 28011 1726882532.13291: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95b05c0> <<< 28011 1726882532.13316: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 28011 1726882532.13353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 28011 1726882532.13378: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 28011 1726882532.13383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 28011 1726882532.13414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 28011 1726882532.13452: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.13458: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e915b4a0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e915ad50> <<< 28011 1726882532.13478: stdout chunk (state=3): >>># extension module '_queue' 
loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.13498: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e915af30> <<< 28011 1726882532.13514: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e915a180> <<< 28011 1726882532.13527: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 28011 1726882532.13627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 28011 1726882532.13632: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e915b5f0> <<< 28011 1726882532.13655: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 28011 1726882532.13684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 28011 1726882532.13713: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.13718: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e91a20f0> <<< 28011 1726882532.13748: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e91a0110> <<< 28011 
1726882532.13775: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95b06e0> <<< 28011 1726882532.13781: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 28011 1726882532.13791: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 28011 1726882532.13815: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28011 1726882532.13834: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 28011 1726882532.13840: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.13911: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.13952: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 28011 1726882532.13977: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.14024: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.14078: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 28011 1726882532.14083: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.14110: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 28011 1726882532.14122: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.14155: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.14189: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 28011 1726882532.14196: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.14247: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.14302: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 28011 1726882532.14311: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.14348: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 28011 1726882532.14397: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 28011 1726882532.14400: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.14462: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.14521: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.14579: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.14639: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 28011 1726882532.14642: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 28011 1726882532.14655: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.15130: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.15555: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 28011 1726882532.15569: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.15623: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.15677: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.15710: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.15752: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 28011 1726882532.15756: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 28011 1726882532.15768: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.15802: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.15826: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 28011 1726882532.15844: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.15896: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.15949: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.dns' # <<< 28011 1726882532.15965: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.16002: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.16030: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 28011 1726882532.16047: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.16074: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.16110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 28011 1726882532.16115: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.16197: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.16281: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 28011 1726882532.16287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 28011 1726882532.16307: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e91a33b0> <<< 28011 1726882532.16335: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 28011 1726882532.16359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 28011 1726882532.16473: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e91a2bd0> import 'ansible.module_utils.facts.system.local' # <<< 28011 1726882532.16492: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.16557: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.16623: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 28011 1726882532.16627: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 28011 1726882532.16756: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.16841: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 28011 1726882532.16844: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.16898: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.16991: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 28011 1726882532.17006: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.17020: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.17065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 28011 1726882532.17111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 28011 1726882532.17174: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.17230: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e91e62d0> <<< 28011 1726882532.17445: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e91d6ea0> import 'ansible.module_utils.facts.system.python' # <<< 28011 1726882532.17458: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.17519: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.17542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 28011 1726882532.17560: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.17639: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 28011 1726882532.17722: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.17835: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.17974: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 28011 1726882532.17988: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.18035: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.18073: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 28011 1726882532.18078: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.18126: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.18172: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 28011 1726882532.18177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 28011 1726882532.18208: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.18239: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e91fa090> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e91fba70> <<< 28011 1726882532.18242: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # <<< 28011 1726882532.18250: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.18269: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 28011 1726882532.18288: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 28011 1726882532.18329: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.18371: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 28011 1726882532.18374: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.18537: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.18684: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 28011 1726882532.18701: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.18797: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.18898: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.18936: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.18977: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 28011 1726882532.19001: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.19020: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.19045: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.19183: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.19327: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 28011 1726882532.19343: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.19468: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.19598: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 28011 1726882532.19707: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.19898: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28011 1726882532.20212: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 28011 1726882532.20778: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 28011 1726882532.20786: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.20865: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.20977: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 28011 1726882532.20981: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.21081: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.21185: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 28011 1726882532.21188: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.21332: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.21501: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 28011 1726882532.21507: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.21545: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 28011 1726882532.21585: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.21632: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 28011 1726882532.21635: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.21763: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.21826: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22204: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22218: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 28011 1726882532.22243: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 
28011 1726882532.22285: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22332: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 28011 1726882532.22346: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22379: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22383: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 28011 1726882532.22402: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22463: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22540: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 28011 1726882532.22571: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22575: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22609: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 28011 1726882532.22612: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22663: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22730: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 28011 1726882532.22739: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22789: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.22859: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 28011 1726882532.22862: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.23122: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.23381: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 28011 1726882532.23455: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.23520: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 28011 1726882532.23560: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.23595: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 28011 1726882532.23636: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28011 1726882532.23667: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 28011 1726882532.23710: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.23713: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.23755: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 28011 1726882532.23758: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.23835: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.23941: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 28011 1726882532.23944: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.23977: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 28011 1726882532.24007: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.24056: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 28011 1726882532.24071: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.24088: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.24126: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.24189: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.24213: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.24269: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.24360: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 28011 1726882532.24372: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 28011 1726882532.24423: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.24537: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 28011 1726882532.24687: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.24866: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 28011 1726882532.24876: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.24918: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.24964: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 28011 1726882532.24984: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.25023: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.25081: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 28011 1726882532.25084: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.25164: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.25337: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 28011 1726882532.25341: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.25369: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.25447: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 28011 1726882532.25526: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 28011 1726882532.26031: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 28011 1726882532.26049: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 28011 1726882532.26097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 28011 1726882532.26101: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e8ff7890> <<< 28011 1726882532.26126: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e8ff5e50> <<< 28011 1726882532.26157: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e8ff4230> <<< 28011 1726882532.26863: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": 
"/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": <<< 28011 1726882532.26889: stdout chunk (state=3): >>>"ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "32", "epoch": "1726882532", "epoch_int": "1726882532", 
"date": "2024-09-20", "time": "21:35:32", "iso8601_micro": "2024-09-21T01:35:32.266860Z", "iso8601": "2024-09-21T01:35:32Z", "iso8601_basic": "20240920T213532266860", "iso8601_basic_short": "20240920T213532", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28011 1726882532.27449: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib <<< 28011 1726882532.27514: stdout chunk (state=3): >>># cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing 
_distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools <<< 28011 1726882532.27520: stdout chunk (state=3): >>># cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct <<< 28011 1726882532.27565: stdout chunk (state=3): >>># cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # 
destroy urllib # cleanup[2] removing ipaddress <<< 28011 1726882532.27572: stdout chunk (state=3): >>># cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess <<< 28011 1726882532.27612: stdout chunk (state=3): >>># cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # 
cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six <<< 28011 1726882532.27668: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # 
cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 28011 1726882532.27673: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils <<< 28011 1726882532.27768: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing 
ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios <<< 28011 1726882532.27777: stdout chunk (state=3): >>># cleanup[2] removing getpass # cleanup[2] 
removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd <<< 28011 1726882532.27781: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # 
cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy 
ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn <<< 28011 1726882532.27815: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy 
ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 28011 1726882532.28123: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 28011 1726882532.28168: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 28011 1726882532.28200: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 28011 1726882532.28218: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 28011 1726882532.28264: stdout chunk (state=3): >>># destroy ntpath <<< 28011 1726882532.28323: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 28011 1726882532.28326: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 28011 1726882532.28378: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 28011 1726882532.28385: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 28011 1726882532.28431: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 28011 1726882532.28486: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy 
multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 28011 1726882532.28528: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime <<< 28011 1726882532.28571: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 28011 1726882532.28621: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 28011 1726882532.28628: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 28011 1726882532.28674: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 28011 1726882532.28728: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] 
wiping encodings.cp437 <<< 28011 1726882532.28763: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct <<< 28011 1726882532.28789: stdout chunk (state=3): >>># cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 28011 1726882532.28839: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 28011 1726882532.28865: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 28011 1726882532.29024: stdout chunk 
(state=3): >>># destroy sys.monitoring # destroy _socket <<< 28011 1726882532.29030: stdout chunk (state=3): >>># destroy _collections <<< 28011 1726882532.29062: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 28011 1726882532.29131: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 28011 1726882532.29147: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 28011 1726882532.29239: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 28011 1726882532.29288: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 28011 1726882532.29372: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 28011 1726882532.29375: stdout chunk (state=3): >>># clear sys.audit hooks <<< 28011 1726882532.29858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.9.159 closed. <<< 28011 1726882532.29861: stdout chunk (state=3): >>><<< 28011 1726882532.29864: stderr chunk (state=3): >>><<< 28011 1726882532.29891: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea1684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea137b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea16aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f3d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f3dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f7be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f7bf50> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9fb3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9fb3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f93b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f91280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f79040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9fd37d0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f08e9fd23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f92150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9fd0c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea008860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f782c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08ea008d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea008bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08ea008f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9f76de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea009610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea0092e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea00a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea020710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08ea021df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f08ea022c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08ea0232f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea0221e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08ea023d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea0234a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea00a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9d17bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9d406e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d40440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9d40710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9d41040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9d419a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d408f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d15d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d42db0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d41af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08ea00ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d6f110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d8f4a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9df0260> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9df29c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9df0380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9db9280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9725340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d8e2a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9d43ce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f08e97255b0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_bguwpstq/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e978f0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e976dfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e976d130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e978cf80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e97be9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e97be750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e97be060> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e97be7e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e978fd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e97bf710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e97bf950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e97bfe90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9629ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e962b8c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e962c2c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e962d460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e962ff20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9d402f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e962e1e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9637e00> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e96368d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9636630> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9636ba0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e962e6f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e967bf20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e967c200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e967dca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e967da60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9680230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e967e360> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e96839e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e96803e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9684ce0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9684a10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e96843b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e967c3b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9510410> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9511370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9686bd0> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9687f20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e96867e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9515790> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9516c00> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95115e0> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9516c90> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9517830> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e9522150> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e951da90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e960ab40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e97ea810> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95222d0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95170e0> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95b22a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e9158200> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e91587d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95981d0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95b2e40> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95b0980> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95b05c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e915b4a0> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f08e915ad50> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e915af30> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e915a180> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e915b5f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e91a20f0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e91a0110> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e95b06e0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e91a33b0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f08e91a2bd0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e91e62d0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e91d6ea0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e91fa090> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e91fba70> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f08e8ff7890> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e8ff5e50> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f08e8ff4230> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": 
"ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"ec2d2d02cced42c36436217cb93f6b8e", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "32", "epoch": "1726882532", "epoch_int": "1726882532", "date": "2024-09-20", "time": "21:35:32", "iso8601_micro": "2024-09-21T01:35:32.266860Z", "iso8601": "2024-09-21T01:35:32Z", "iso8601_basic": "20240920T213532266860", "iso8601_basic_short": "20240920T213532", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] 
removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing 
weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] 
removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] 
removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing 
ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # 
cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping 
configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # 
cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
[WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] 
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 28011 1726882532.30839: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882532.30842: _low_level_execute_command(): starting 28011 1726882532.30848: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882531.8316693-28103-226582408038535/ > /dev/null 2>&1 && sleep 0' 28011 1726882532.30863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882532.30901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882532.30905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882532.30909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882532.30951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882532.32730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882532.32758: stderr chunk (state=3): >>><<< 28011 1726882532.32761: stdout chunk (state=3): >>><<< 28011 1726882532.32770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882532.32798: handler run complete 28011 1726882532.32816: variable 'ansible_facts' from source: unknown 28011 1726882532.32851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882532.32924: variable 'ansible_facts' from source: unknown 28011 1726882532.32955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882532.32987: attempt loop complete, returning result 28011 1726882532.32990: _execute() done 28011 1726882532.32996: dumping result to json 28011 1726882532.33006: done dumping result, returning 28011 1726882532.33013: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [12673a56-9f93-962d-7c65-000000000106] 28011 1726882532.33019: sending task result for task 12673a56-9f93-962d-7c65-000000000106 28011 1726882532.33140: done sending task result for task 
12673a56-9f93-962d-7c65-000000000106 28011 1726882532.33143: WORKER PROCESS EXITING ok: [managed_node1] 28011 1726882532.33245: no more pending results, returning what we have 28011 1726882532.33248: results queue empty 28011 1726882532.33249: checking for any_errors_fatal 28011 1726882532.33250: done checking for any_errors_fatal 28011 1726882532.33251: checking for max_fail_percentage 28011 1726882532.33254: done checking for max_fail_percentage 28011 1726882532.33254: checking to see if all hosts have failed and the running result is not ok 28011 1726882532.33255: done checking to see if all hosts have failed 28011 1726882532.33256: getting the remaining hosts for this loop 28011 1726882532.33257: done getting the remaining hosts for this loop 28011 1726882532.33260: getting the next task for host managed_node1 28011 1726882532.33267: done getting next task for host managed_node1 28011 1726882532.33269: ^ task is: TASK: Check if system is ostree 28011 1726882532.33271: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882532.33275: getting variables 28011 1726882532.33276: in VariableManager get_vars() 28011 1726882532.33303: Calling all_inventory to load vars for managed_node1 28011 1726882532.33305: Calling groups_inventory to load vars for managed_node1 28011 1726882532.33308: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882532.33324: Calling all_plugins_play to load vars for managed_node1 28011 1726882532.33326: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882532.33329: Calling groups_plugins_play to load vars for managed_node1 28011 1726882532.33468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882532.33580: done with get_vars() 28011 1726882532.33588: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:35:32 -0400 (0:00:00.578) 0:00:01.887 ****** 28011 1726882532.33653: entering _queue_task() for managed_node1/stat 28011 1726882532.33842: worker is 1 (out of 1 available) 28011 1726882532.33856: exiting _queue_task() for managed_node1/stat 28011 1726882532.33867: done queuing things up, now waiting for results queue to drain 28011 1726882532.33868: waiting for pending results... 
28011 1726882532.34013: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 28011 1726882532.34078: in run() - task 12673a56-9f93-962d-7c65-000000000108 28011 1726882532.34090: variable 'ansible_search_path' from source: unknown 28011 1726882532.34102: variable 'ansible_search_path' from source: unknown 28011 1726882532.34125: calling self._execute() 28011 1726882532.34176: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882532.34181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882532.34189: variable 'omit' from source: magic vars 28011 1726882532.34540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882532.34773: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882532.34809: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882532.34840: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882532.34872: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882532.34972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882532.34999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882532.35103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882532.35106: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882532.35159: Evaluated conditional (not __network_is_ostree is defined): True 28011 1726882532.35162: variable 'omit' from source: magic vars 28011 1726882532.35204: variable 'omit' from source: magic vars 28011 1726882532.35238: variable 'omit' from source: magic vars 28011 1726882532.35261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882532.35287: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882532.35321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882532.35429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882532.35435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882532.35438: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882532.35440: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882532.35443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882532.35463: Set connection var ansible_connection to ssh 28011 1726882532.35471: Set connection var ansible_pipelining to False 28011 1726882532.35477: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882532.35483: Set connection var ansible_shell_executable to /bin/sh 28011 1726882532.35490: Set connection var ansible_timeout to 10 28011 1726882532.35500: Set connection var ansible_shell_type to sh 28011 1726882532.35538: variable 'ansible_shell_executable' from source: unknown 28011 1726882532.35541: variable 'ansible_connection' from 
source: unknown 28011 1726882532.35544: variable 'ansible_module_compression' from source: unknown 28011 1726882532.35546: variable 'ansible_shell_type' from source: unknown 28011 1726882532.35548: variable 'ansible_shell_executable' from source: unknown 28011 1726882532.35550: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882532.35601: variable 'ansible_pipelining' from source: unknown 28011 1726882532.35604: variable 'ansible_timeout' from source: unknown 28011 1726882532.35607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882532.35680: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882532.35690: variable 'omit' from source: magic vars 28011 1726882532.35699: starting attempt loop 28011 1726882532.35702: running the handler 28011 1726882532.35757: _low_level_execute_command(): starting 28011 1726882532.35760: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882532.36358: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882532.36367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882532.36378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882532.36392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882532.36410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882532.36425: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882532.36428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 
1726882532.36474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882532.36478: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882532.36480: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28011 1726882532.36483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882532.36485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882532.36488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882532.36490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882532.36492: stderr chunk (state=3): >>>debug2: match found <<< 28011 1726882532.36506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882532.36584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882532.36587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882532.36598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882532.36666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882532.38230: stdout chunk (state=3): >>>/root <<< 28011 1726882532.38329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882532.38400: stderr chunk (state=3): >>><<< 28011 1726882532.38406: stdout chunk (state=3): >>><<< 28011 1726882532.38409: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882532.38418: _low_level_execute_command(): starting 28011 1726882532.38424: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781 `" && echo ansible-tmp-1726882532.3838449-28128-184989492690781="` echo /root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781 `" ) && sleep 0' 28011 1726882532.38999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882532.39002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882532.39005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882532.39007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882532.39020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 
28011 1726882532.39082: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882532.39085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882532.39088: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882532.39090: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882532.39092: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28011 1726882532.39096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882532.39098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882532.39099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882532.39101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882532.39110: stderr chunk (state=3): >>>debug2: match found <<< 28011 1726882532.39119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882532.39184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882532.39222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882532.39225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882532.39284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882532.41112: stdout chunk (state=3): >>>ansible-tmp-1726882532.3838449-28128-184989492690781=/root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781 <<< 28011 1726882532.41273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882532.41276: stdout chunk (state=3): >>><<< 28011 
1726882532.41279: stderr chunk (state=3): >>><<< 28011 1726882532.41306: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882532.3838449-28128-184989492690781=/root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882532.41498: variable 'ansible_module_compression' from source: unknown 28011 1726882532.41501: ANSIBALLZ: Using lock for stat 28011 1726882532.41503: ANSIBALLZ: Acquiring lock 28011 1726882532.41505: ANSIBALLZ: Lock acquired: 139767565767920 28011 1726882532.41507: ANSIBALLZ: Creating module 28011 1726882532.52502: ANSIBALLZ: Writing module into payload 28011 1726882532.52609: ANSIBALLZ: Writing module 28011 1726882532.52633: ANSIBALLZ: Renaming module 28011 1726882532.52651: ANSIBALLZ: Done creating module 28011 1726882532.52670: variable 'ansible_facts' from 
source: unknown 28011 1726882532.52764: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781/AnsiballZ_stat.py 28011 1726882532.52864: Sending initial data 28011 1726882532.52868: Sent initial data (153 bytes) 28011 1726882532.53505: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882532.53549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882532.53601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882532.55103: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28011 1726882532.55107: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882532.55142: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882532.55183: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp6gticqng /root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781/AnsiballZ_stat.py <<< 28011 1726882532.55186: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781/AnsiballZ_stat.py" <<< 28011 1726882532.55226: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp6gticqng" to remote "/root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781/AnsiballZ_stat.py" <<< 28011 1726882532.55809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882532.55976: stderr chunk (state=3): >>><<< 28011 1726882532.55980: stdout chunk (state=3): >>><<< 28011 1726882532.55982: done transferring module to remote 28011 1726882532.55985: _low_level_execute_command(): starting 28011 1726882532.55987: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781/ /root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781/AnsiballZ_stat.py && sleep 0' 28011 1726882532.56519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 28011 1726882532.56522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882532.56525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882532.56528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882532.56530: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882532.56532: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882532.56538: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28011 1726882532.56553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882532.56570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882532.56589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882532.56639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882532.56651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882532.56703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882532.58413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882532.58431: stderr chunk (state=3): >>><<< 28011 1726882532.58434: stdout chunk (state=3): >>><<< 28011 1726882532.58446: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882532.58449: _low_level_execute_command(): starting 28011 1726882532.58454: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781/AnsiballZ_stat.py && sleep 0' 28011 1726882532.58843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882532.58846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882532.58849: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882532.58851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882532.58900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882532.58904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882532.58953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882532.61048: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 28011 1726882532.61080: stdout chunk (state=3): >>>import _imp # builtin <<< 28011 1726882532.61117: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 28011 1726882532.61185: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 28011 1726882532.61231: stdout chunk (state=3): >>>import 'posix' # <<< 28011 1726882532.61260: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 28011 1726882532.61267: stdout chunk (state=3): >>># installing zipimport hook <<< 28011 1726882532.61282: stdout chunk (state=3): >>>import 'time' # <<< 28011 1726882532.61302: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 28011 1726882532.61348: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/__init__.py <<< 28011 1726882532.61353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.61369: stdout chunk (state=3): >>>import '_codecs' # <<< 28011 1726882532.61400: stdout chunk (state=3): >>>import 'codecs' # <<< 28011 1726882532.61431: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 28011 1726882532.61463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd2e84d0> <<< 28011 1726882532.61471: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd2b7b30> <<< 28011 1726882532.61506: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 28011 1726882532.61512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd2eaa50> <<< 28011 1726882532.61538: stdout chunk (state=3): >>>import '_signal' # <<< 28011 1726882532.61562: stdout chunk (state=3): >>>import '_abc' # <<< 28011 1726882532.61569: stdout chunk (state=3): >>>import 'abc' # <<< 28011 1726882532.61583: stdout chunk (state=3): >>>import 'io' # <<< 28011 1726882532.61620: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 28011 1726882532.61710: stdout chunk (state=3): >>>import '_collections_abc' # <<< 28011 1726882532.61735: stdout chunk (state=3): >>>import 'genericpath' # <<< 28011 1726882532.61738: stdout chunk (state=3): >>>import 'posixpath' # <<< 28011 1726882532.61771: stdout chunk (state=3): >>>import 'os' # <<< 
28011 1726882532.61788: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 28011 1726882532.61796: stdout chunk (state=3): >>>Processing user site-packages <<< 28011 1726882532.61819: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 28011 1726882532.61822: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 28011 1726882532.61832: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 28011 1726882532.61861: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 28011 1726882532.61891: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd099130> <<< 28011 1726882532.61944: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 28011 1726882532.61961: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.61964: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd099fa0> <<< 28011 1726882532.62000: stdout chunk (state=3): >>>import 'site' # <<< 28011 1726882532.62022: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 28011 1726882532.62253: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 28011 1726882532.62259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 28011 1726882532.62289: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 28011 1726882532.62303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.62320: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 28011 1726882532.62361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 28011 1726882532.62377: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 28011 1726882532.62411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 28011 1726882532.62426: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0d7e60> <<< 28011 1726882532.62443: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 28011 1726882532.62455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 28011 1726882532.62482: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0d7f20> <<< 28011 1726882532.62509: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 28011 1726882532.62536: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 28011 1726882532.62560: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 28011 1726882532.62607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.62628: stdout chunk (state=3): >>>import 'itertools' # <<< 28011 1726882532.62654: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 28011 1726882532.62660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd10f890> <<< 28011 1726882532.62681: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 28011 1726882532.62684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 28011 1726882532.62704: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd10ff20> <<< 28011 1726882532.62715: stdout chunk (state=3): >>>import '_collections' # <<< 28011 1726882532.62768: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0efb30> <<< 28011 1726882532.62770: stdout chunk (state=3): >>>import '_functools' # <<< 28011 1726882532.62808: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0ed250> <<< 28011 1726882532.62900: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0d5010> <<< 28011 1726882532.62924: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc 
matches /usr/lib64/python3.12/re/_compiler.py <<< 28011 1726882532.62938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 28011 1726882532.62972: stdout chunk (state=3): >>>import '_sre' # <<< 28011 1726882532.62977: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 28011 1726882532.63005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 28011 1726882532.63024: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 28011 1726882532.63037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 28011 1726882532.63069: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd12f800> <<< 28011 1726882532.63084: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd12e450> <<< 28011 1726882532.63111: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 28011 1726882532.63115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0ee120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd12ccb0> <<< 28011 1726882532.63176: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 28011 1726882532.63187: stdout chunk (state=3): >>>import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36cd164860> <<< 28011 1726882532.63196: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0d4290> <<< 28011 1726882532.63218: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 28011 1726882532.63248: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cd164d10> <<< 28011 1726882532.63261: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd164bc0> <<< 28011 1726882532.63308: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.63314: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cd164fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0d2db0> <<< 28011 1726882532.63349: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 28011 1726882532.63354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.63369: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 28011 
1726882532.63416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 28011 1726882532.63419: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd1656a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd165370> <<< 28011 1726882532.63431: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 28011 1726882532.63464: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 28011 1726882532.63476: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd1665a0> <<< 28011 1726882532.63503: stdout chunk (state=3): >>>import 'importlib.util' # <<< 28011 1726882532.63509: stdout chunk (state=3): >>>import 'runpy' # <<< 28011 1726882532.63534: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 28011 1726882532.63567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 28011 1726882532.63589: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 28011 1726882532.63600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 28011 1726882532.63609: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd17c7a0> <<< 28011 1726882532.63626: stdout chunk (state=3): >>>import 'errno' # <<< 28011 1726882532.63652: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.63664: 
stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cd17de80> <<< 28011 1726882532.63679: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 28011 1726882532.63692: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 28011 1726882532.63718: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 28011 1726882532.63723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 28011 1726882532.63740: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd17ed20> <<< 28011 1726882532.63779: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.63787: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cd17f320> <<< 28011 1726882532.63803: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd17e270> <<< 28011 1726882532.63806: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 28011 1726882532.63827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 28011 1726882532.63862: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 28011 
1726882532.63879: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cd17fda0> <<< 28011 1726882532.63882: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd17f4d0> <<< 28011 1726882532.63928: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd166510> <<< 28011 1726882532.63954: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 28011 1726882532.63974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 28011 1726882532.64000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 28011 1726882532.64015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 28011 1726882532.64050: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ccf13bf0> <<< 28011 1726882532.64075: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 28011 1726882532.64110: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ccf3c740> <<< 28011 1726882532.64116: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf3c4a0> <<< 28011 1726882532.64142: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.64147: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ccf3c680> <<< 28011 1726882532.64183: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 28011 1726882532.64254: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.64382: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ccf3cfe0><<< 28011 1726882532.64389: stdout chunk (state=3): >>> <<< 28011 1726882532.64502: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.64505: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ccf3d910> <<< 28011 1726882532.64514: stdout chunk (state=3): >>>import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf3c8c0> <<< 28011 1726882532.64524: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf11d90> <<< 28011 1726882532.64546: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 28011 1726882532.64563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 28011 1726882532.64594: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 28011 1726882532.64607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 28011 1726882532.64614: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf3ed20> <<< 28011 1726882532.64635: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf3da60> <<< 28011 1726882532.64656: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd166750> <<< 28011 1726882532.64685: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py<<< 28011 1726882532.64691: stdout chunk (state=3): >>> <<< 28011 1726882532.64747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.64769: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 28011 1726882532.64803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 28011 1726882532.64835: stdout chunk (state=3): >>>import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf67080> <<< 28011 1726882532.64882: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 28011 1726882532.64902: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.64918: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 28011 1726882532.64945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 28011 1726882532.64981: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf8b440> <<< 28011 1726882532.65008: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 28011 1726882532.65051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 28011 1726882532.65109: stdout chunk (state=3): >>>import 'ntpath' # <<< 28011 1726882532.65126: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 28011 1726882532.65136: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccfec230> <<< 28011 1726882532.65145: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 28011 1726882532.65179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 28011 1726882532.65205: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 28011 1726882532.65246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 28011 1726882532.65327: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccfee990> <<< 28011 1726882532.65406: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccfec350> <<< 28011 1726882532.65438: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccfb9250> <<< 28011 1726882532.65466: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc925310> <<< 28011 1726882532.65492: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf8a240> <<< 28011 1726882532.65497: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf3fc50> <<< 28011 1726882532.65608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 28011 1726882532.65619: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f36cc9255b0> <<< 28011 1726882532.65768: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_kzvgyoqz/ansible_stat_payload.zip' <<< 28011 1726882532.65772: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.65907: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.65929: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 28011 1726882532.65944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 28011 1726882532.65986: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 28011 1726882532.66060: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 28011 1726882532.66090: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 28011 1726882532.66106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc97afc0> <<< 28011 1726882532.66112: stdout chunk (state=3): >>>import '_typing' # <<< 28011 1726882532.66288: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc959eb0> <<< 28011 1726882532.66304: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9590a0> # zipimport: zlib available <<< 28011 1726882532.66332: stdout chunk (state=3): >>>import 'ansible' # <<< 28011 1726882532.66350: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.66361: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.66383: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.66390: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 28011 1726882532.66413: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.67786: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.68891: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9792b0> <<< 28011 1726882532.68922: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 28011 1726882532.68927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.68948: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 28011 1726882532.68967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 28011 1726882532.68982: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 28011 1726882532.68986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 28011 1726882532.69016: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.69022: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc9a6960> <<< 28011 1726882532.69051: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9a66f0> <<< 28011 1726882532.69080: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9a6030> <<< 28011 1726882532.69106: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 28011 1726882532.69112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 28011 1726882532.69154: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9a6480> <<< 28011 1726882532.69157: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc97bc50> import 'atexit' # <<< 28011 1726882532.69190: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc9a7710> <<< 28011 1726882532.69231: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.69237: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc9a7950> <<< 28011 1726882532.69250: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 28011 1726882532.69292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 28011 1726882532.69305: stdout chunk (state=3): >>>import '_locale' # <<< 28011 1726882532.69353: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9a7e90> <<< 28011 1726882532.69358: stdout chunk (state=3): >>>import 'pwd' # <<< 28011 1726882532.69381: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 28011 1726882532.69406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 28011 1726882532.69443: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc80dc10> <<< 28011 1726882532.69468: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.69475: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc80f800> <<< 28011 1726882532.69495: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 28011 1726882532.69510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 28011 1726882532.69545: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8101d0> <<< 28011 1726882532.69569: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 28011 1726882532.69597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 28011 1726882532.69613: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc811370> <<< 28011 1726882532.69634: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 28011 1726882532.69671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 28011 
1726882532.69695: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 28011 1726882532.69698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 28011 1726882532.69752: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc813e30> <<< 28011 1726882532.69787: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.69796: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc9787d0> <<< 28011 1726882532.69806: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8120f0> <<< 28011 1726882532.69828: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 28011 1726882532.69855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 28011 1726882532.69876: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 28011 1726882532.69905: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 28011 1726882532.69931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 28011 1726882532.69957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc 
matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 28011 1726882532.69975: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc81be30> <<< 28011 1726882532.69981: stdout chunk (state=3): >>>import '_tokenize' # <<< 28011 1726882532.70052: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc81a900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc81a660> <<< 28011 1726882532.70076: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 28011 1726882532.70079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 28011 1726882532.70157: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc81abd0> <<< 28011 1726882532.70184: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc812600> <<< 28011 1726882532.70214: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc863e90> <<< 28011 1726882532.70243: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8641a0> <<< 28011 1726882532.70271: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 28011 1726882532.70281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 28011 1726882532.70309: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 28011 1726882532.70347: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.70352: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc865c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc865a00> <<< 28011 1726882532.70372: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 28011 1726882532.70464: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 28011 1726882532.70515: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc868200> <<< 28011 1726882532.70519: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc866330> <<< 28011 1726882532.70544: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/logging/__init__.py <<< 28011 1726882532.70580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.70607: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 28011 1726882532.70615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 28011 1726882532.70619: stdout chunk (state=3): >>>import '_string' # <<< 28011 1726882532.70667: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc86b9e0> <<< 28011 1726882532.71025: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8683b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc86ca70> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc86cb00> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc86cd10> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36cc864350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 28011 1726882532.71052: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 28011 1726882532.71055: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc8f4470> <<< 28011 1726882532.71233: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc8f55b0> <<< 28011 1726882532.71265: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc86ec30> <<< 28011 1726882532.71295: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc86ff80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc86e810> # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 28011 1726882532.71317: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.71401: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.71478: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.71537: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 28011 1726882532.71541: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.71543: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 28011 1726882532.71572: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.71675: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.71795: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.72318: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.73013: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc8fd8e0> <<< 28011 1726882532.73054: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 28011 1726882532.73070: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8fed80> <<< 28011 1726882532.73073: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8f5760> <<< 28011 1726882532.73118: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 28011 1726882532.73151: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.73170: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 28011 1726882532.73184: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.73319: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.73483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 28011 1726882532.73512: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8fede0> <<< 28011 1726882532.73524: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.73943: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.74380: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.74451: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.74530: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 28011 1726882532.74537: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.74575: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.74613: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 28011 1726882532.74622: stdout chunk (state=3): >>># zipimport: zlib available <<< 
28011 1726882532.74686: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.74772: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 28011 1726882532.74778: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.74805: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.74815: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 28011 1726882532.74824: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.74860: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.74906: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 28011 1726882532.74908: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.75133: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.75361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 28011 1726882532.75420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 28011 1726882532.75424: stdout chunk (state=3): >>>import '_ast' # <<< 28011 1726882532.75494: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8ffa10> <<< 28011 1726882532.75508: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.75577: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.75653: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 28011 1726882532.75664: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 28011 1726882532.75670: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 28011 1726882532.75687: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 
1726882532.75735: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.75773: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 28011 1726882532.75788: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.75829: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.75874: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.75932: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.76001: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 28011 1726882532.76032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.76111: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc70a180> <<< 28011 1726882532.76150: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc7074d0> <<< 28011 1726882532.76177: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 28011 1726882532.76188: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.76263: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.76322: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.76346: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.76395: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 28011 1726882532.76417: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 28011 1726882532.76433: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 28011 1726882532.76452: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 28011 1726882532.76508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 28011 1726882532.76530: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 28011 1726882532.76540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 28011 1726882532.76597: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9fe960> <<< 28011 1726882532.76635: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9ee630> <<< 28011 1726882532.76718: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8ff200> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8fe5d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 28011 1726882532.76724: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.76751: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.76778: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common._utils' # <<< 28011 1726882532.76783: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 28011 1726882532.76837: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 28011 1726882532.76842: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.76862: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 28011 1726882532.76871: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.77003: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.77180: stdout chunk (state=3): >>># zipimport: zlib available <<< 28011 1726882532.77297: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 28011 1726882532.77568: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path <<< 28011 1726882532.77582: stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type<<< 28011 1726882532.77587: stdout chunk (state=3): >>> # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp <<< 28011 1726882532.77610: stdout chunk (state=3): >>># cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # 
cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig <<< 28011 1726882532.77645: stdout chunk (state=3): >>># cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy <<< 28011 1726882532.77649: stdout chunk (state=3): >>># destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] 
removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils <<< 28011 1726882532.77662: stdout chunk (state=3): >>># cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl<<< 28011 1726882532.77681: stdout chunk (state=3): >>> # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # 
cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string <<< 28011 1726882532.77719: stdout chunk (state=3): >>># cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy <<< 28011 1726882532.77722: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors 
# destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 28011 1726882532.77959: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 28011 1726882532.77972: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 28011 
1726882532.77978: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 28011 1726882532.78006: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 28011 1726882532.78035: stdout chunk (state=3): >>># destroy ntpath <<< 28011 1726882532.78056: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 28011 1726882532.78068: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 28011 1726882532.78089: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 28011 1726882532.78112: stdout chunk (state=3): >>># destroy selectors # destroy errno <<< 28011 1726882532.78125: stdout chunk (state=3): >>># destroy array # destroy datetime # destroy selinux <<< 28011 1726882532.78136: stdout chunk (state=3): >>># destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 28011 1726882532.78182: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 28011 1726882532.78187: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 28011 1726882532.78192: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] 
wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache <<< 28011 1726882532.78216: stdout chunk (state=3): >>># destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 28011 1726882532.78236: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 28011 1726882532.78245: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 28011 1726882532.78258: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 28011 1726882532.78273: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 28011 1726882532.78276: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 28011 1726882532.78283: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # 
cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 28011 1726882532.78299: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 28011 1726882532.78306: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 28011 1726882532.78428: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 28011 1726882532.78458: stdout chunk (state=3): >>># destroy _collections # destroy platform<<< 28011 1726882532.78462: stdout chunk (state=3): >>> # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 28011 1726882532.78480: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 28011 1726882532.78517: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 28011 1726882532.78520: stdout chunk (state=3): >>># destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 28011 1726882532.78545: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 28011 1726882532.78626: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 28011 1726882532.78645: stdout chunk (state=3): >>># 
destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 28011 1726882532.78655: stdout chunk (state=3): >>># destroy time <<< 28011 1726882532.78674: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 28011 1726882532.78684: stdout chunk (state=3): >>># destroy _hashlib <<< 28011 1726882532.78709: stdout chunk (state=3): >>># destroy _operator # destroy _string # destroy re <<< 28011 1726882532.78725: stdout chunk (state=3): >>># destroy itertools <<< 28011 1726882532.78730: stdout chunk (state=3): >>># destroy _abc <<< 28011 1726882532.78740: stdout chunk (state=3): >>># destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 28011 1726882532.78743: stdout chunk (state=3): >>># clear sys.audit hooks <<< 28011 1726882532.79069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882532.79107: stderr chunk (state=3): >>><<< 28011 1726882532.79110: stdout chunk (state=3): >>><<< 28011 1726882532.79170: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd2e84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd2b7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd2eaa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd099130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd099fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0d7e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0d7f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd10f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd10ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0efb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0ed250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0d5010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd12f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd12e450> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0ee120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd12ccb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd164860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0d4290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cd164d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd164bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cd164fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd0d2db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd1656a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd165370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd1665a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd17c7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cd17de80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd17ed20> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cd17f320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd17e270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cd17fda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd17f4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd166510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ccf13bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ccf3c740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf3c4a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ccf3c680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ccf3cfe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ccf3d910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf3c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf11d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf3ed20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf3da60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cd166750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf67080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf8b440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccfec230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccfee990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccfec350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccfb9250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc925310> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf8a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ccf3fc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f36cc9255b0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_kzvgyoqz/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f36cc97afc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc959eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9590a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9792b0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc9a6960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9a66f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9a6030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9a6480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc97bc50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc9a7710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc9a7950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9a7e90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc80dc10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc80f800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8101d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc811370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc813e30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc9787d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8120f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc81be30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc81a900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc81a660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc81abd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc812600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc863e90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8641a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc865c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc865a00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc868200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc866330> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc86b9e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8683b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc86ca70> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc86cb00> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc86cd10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc864350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc8f4470> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc8f55b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc86ec30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc86ff80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc86e810> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc8fd8e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8fed80> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8f5760> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8fede0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8ffa10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36cc70a180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc7074d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9fe960> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc9ee630> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8ff200> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36cc8fe5d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] 
removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # 
cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] 
removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select 
# destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] 
wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # 
destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 28011 1726882532.79701: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882532.79704: _low_level_execute_command(): starting 28011 1726882532.79707: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882532.3838449-28128-184989492690781/ > /dev/null 2>&1 && sleep 0' 28011 1726882532.79840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882532.79844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882532.79846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882532.79848: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config <<< 28011 1726882532.79850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882532.79852: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882532.79885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882532.79904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882532.79952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882532.81736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882532.81757: stderr chunk (state=3): >>><<< 28011 1726882532.81761: stdout chunk (state=3): >>><<< 28011 1726882532.81777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882532.81782: handler run complete 28011 1726882532.81802: attempt loop complete, returning result 28011 1726882532.81805: _execute() done 28011 1726882532.81807: dumping result to json 28011 1726882532.81809: done dumping result, returning 28011 1726882532.81818: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [12673a56-9f93-962d-7c65-000000000108] 28011 1726882532.81821: sending task result for task 12673a56-9f93-962d-7c65-000000000108 28011 1726882532.81904: done sending task result for task 12673a56-9f93-962d-7c65-000000000108 28011 1726882532.81907: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 28011 1726882532.81981: no more pending results, returning what we have 28011 1726882532.81984: results queue empty 28011 1726882532.81985: checking for any_errors_fatal 28011 1726882532.81991: done checking for any_errors_fatal 28011 1726882532.81992: checking for max_fail_percentage 28011 1726882532.81995: done checking for max_fail_percentage 28011 1726882532.81996: checking to see if all hosts have failed and the running result is not ok 28011 1726882532.81997: done checking to see if all hosts have failed 28011 1726882532.81997: getting the remaining hosts for this loop 28011 1726882532.81998: done getting the remaining hosts for this loop 28011 1726882532.82002: getting the next task for host managed_node1 28011 1726882532.82007: done getting next task for host managed_node1 28011 1726882532.82009: ^ task is: TASK: Set flag to indicate system is ostree 28011 1726882532.82011: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882532.82014: getting variables 28011 1726882532.82018: in VariableManager get_vars() 28011 1726882532.82046: Calling all_inventory to load vars for managed_node1 28011 1726882532.82048: Calling groups_inventory to load vars for managed_node1 28011 1726882532.82051: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882532.82061: Calling all_plugins_play to load vars for managed_node1 28011 1726882532.82063: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882532.82066: Calling groups_plugins_play to load vars for managed_node1 28011 1726882532.82219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882532.82336: done with get_vars() 28011 1726882532.82344: done getting variables 28011 1726882532.82416: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:35:32 -0400 (0:00:00.487) 0:00:02.375 ****** 28011 1726882532.82438: entering _queue_task() for managed_node1/set_fact 28011 1726882532.82439: 
Creating lock for set_fact 28011 1726882532.82635: worker is 1 (out of 1 available) 28011 1726882532.82648: exiting _queue_task() for managed_node1/set_fact 28011 1726882532.82659: done queuing things up, now waiting for results queue to drain 28011 1726882532.82661: waiting for pending results... 28011 1726882532.82805: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 28011 1726882532.82865: in run() - task 12673a56-9f93-962d-7c65-000000000109 28011 1726882532.82875: variable 'ansible_search_path' from source: unknown 28011 1726882532.82878: variable 'ansible_search_path' from source: unknown 28011 1726882532.82910: calling self._execute() 28011 1726882532.82963: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882532.82966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882532.82975: variable 'omit' from source: magic vars 28011 1726882532.83346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882532.83507: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882532.83541: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882532.83566: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882532.83595: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882532.83654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882532.83672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 
28011 1726882532.83692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882532.83710: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882532.83797: Evaluated conditional (not __network_is_ostree is defined): True 28011 1726882532.83800: variable 'omit' from source: magic vars 28011 1726882532.83823: variable 'omit' from source: magic vars 28011 1726882532.83902: variable '__ostree_booted_stat' from source: set_fact 28011 1726882532.83936: variable 'omit' from source: magic vars 28011 1726882532.83954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882532.83973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882532.83995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882532.84005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882532.84014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882532.84036: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882532.84039: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882532.84042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882532.84110: Set connection var ansible_connection to ssh 28011 1726882532.84113: Set connection var ansible_pipelining to False 28011 1726882532.84120: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 
1726882532.84124: Set connection var ansible_shell_executable to /bin/sh 28011 1726882532.84131: Set connection var ansible_timeout to 10 28011 1726882532.84136: Set connection var ansible_shell_type to sh 28011 1726882532.84152: variable 'ansible_shell_executable' from source: unknown 28011 1726882532.84155: variable 'ansible_connection' from source: unknown 28011 1726882532.84157: variable 'ansible_module_compression' from source: unknown 28011 1726882532.84160: variable 'ansible_shell_type' from source: unknown 28011 1726882532.84162: variable 'ansible_shell_executable' from source: unknown 28011 1726882532.84164: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882532.84169: variable 'ansible_pipelining' from source: unknown 28011 1726882532.84171: variable 'ansible_timeout' from source: unknown 28011 1726882532.84175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882532.84245: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882532.84253: variable 'omit' from source: magic vars 28011 1726882532.84257: starting attempt loop 28011 1726882532.84260: running the handler 28011 1726882532.84269: handler run complete 28011 1726882532.84277: attempt loop complete, returning result 28011 1726882532.84279: _execute() done 28011 1726882532.84281: dumping result to json 28011 1726882532.84284: done dumping result, returning 28011 1726882532.84292: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [12673a56-9f93-962d-7c65-000000000109] 28011 1726882532.84297: sending task result for task 12673a56-9f93-962d-7c65-000000000109 28011 1726882532.84372: done sending task result for task 
12673a56-9f93-962d-7c65-000000000109 28011 1726882532.84374: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 28011 1726882532.84426: no more pending results, returning what we have 28011 1726882532.84429: results queue empty 28011 1726882532.84430: checking for any_errors_fatal 28011 1726882532.84436: done checking for any_errors_fatal 28011 1726882532.84437: checking for max_fail_percentage 28011 1726882532.84438: done checking for max_fail_percentage 28011 1726882532.84439: checking to see if all hosts have failed and the running result is not ok 28011 1726882532.84440: done checking to see if all hosts have failed 28011 1726882532.84440: getting the remaining hosts for this loop 28011 1726882532.84441: done getting the remaining hosts for this loop 28011 1726882532.84444: getting the next task for host managed_node1 28011 1726882532.84451: done getting next task for host managed_node1 28011 1726882532.84453: ^ task is: TASK: Fix CentOS6 Base repo 28011 1726882532.84455: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882532.84459: getting variables 28011 1726882532.84460: in VariableManager get_vars() 28011 1726882532.84482: Calling all_inventory to load vars for managed_node1 28011 1726882532.84484: Calling groups_inventory to load vars for managed_node1 28011 1726882532.84486: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882532.84498: Calling all_plugins_play to load vars for managed_node1 28011 1726882532.84500: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882532.84507: Calling groups_plugins_play to load vars for managed_node1 28011 1726882532.84647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882532.84762: done with get_vars() 28011 1726882532.84768: done getting variables 28011 1726882532.84853: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:35:32 -0400 (0:00:00.024) 0:00:02.400 ****** 28011 1726882532.84873: entering _queue_task() for managed_node1/copy 28011 1726882532.85056: worker is 1 (out of 1 available) 28011 1726882532.85067: exiting _queue_task() for managed_node1/copy 28011 1726882532.85078: done queuing things up, now waiting for results queue to drain 28011 1726882532.85079: waiting for pending results... 
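The two task results above ("Check if system is ostree" returning `stat.exists: false`, then "Set flag to indicate system is ostree" setting `__network_is_ostree: false`) record a common detection pattern: stat a marker file, then cache the result as a fact. A minimal sketch of what those tasks likely look like, reconstructed only from the task names, paths, variables, and conditionals visible in this log (the exact YAML in `el_repo_setup.yml` is not shown here and may differ):

```yaml
# Hedged reconstruction from the log above -- task names, the marker path,
# the register variable, and the conditional are taken from the log output;
# the surrounding structure is assumed.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

The `when: not __network_is_ostree is defined` guard (the conditional the log reports as evaluated `True`) makes the check idempotent: once the fact is set, reruns skip both tasks instead of re-stating the file.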
28011 1726882532.85218: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 28011 1726882532.85269: in run() - task 12673a56-9f93-962d-7c65-00000000010b 28011 1726882532.85279: variable 'ansible_search_path' from source: unknown 28011 1726882532.85285: variable 'ansible_search_path' from source: unknown 28011 1726882532.85315: calling self._execute() 28011 1726882532.85367: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882532.85370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882532.85378: variable 'omit' from source: magic vars 28011 1726882532.85698: variable 'ansible_distribution' from source: facts 28011 1726882532.85713: Evaluated conditional (ansible_distribution == 'CentOS'): True 28011 1726882532.85798: variable 'ansible_distribution_major_version' from source: facts 28011 1726882532.85803: Evaluated conditional (ansible_distribution_major_version == '6'): False 28011 1726882532.85807: when evaluation is False, skipping this task 28011 1726882532.85809: _execute() done 28011 1726882532.85812: dumping result to json 28011 1726882532.85814: done dumping result, returning 28011 1726882532.85820: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [12673a56-9f93-962d-7c65-00000000010b] 28011 1726882532.85825: sending task result for task 12673a56-9f93-962d-7c65-00000000010b skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 28011 1726882532.85969: no more pending results, returning what we have 28011 1726882532.85972: results queue empty 28011 1726882532.85972: checking for any_errors_fatal 28011 1726882532.85976: done checking for any_errors_fatal 28011 1726882532.85977: checking for max_fail_percentage 28011 1726882532.85978: done checking for max_fail_percentage 28011 1726882532.85979: checking to see if all hosts have failed and the 
running result is not ok 28011 1726882532.85979: done checking to see if all hosts have failed 28011 1726882532.85980: getting the remaining hosts for this loop 28011 1726882532.85981: done getting the remaining hosts for this loop 28011 1726882532.85984: getting the next task for host managed_node1 28011 1726882532.85988: done getting next task for host managed_node1 28011 1726882532.85994: ^ task is: TASK: Include the task 'enable_epel.yml' 28011 1726882532.85997: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882532.86000: getting variables 28011 1726882532.86001: in VariableManager get_vars() 28011 1726882532.86026: Calling all_inventory to load vars for managed_node1 28011 1726882532.86029: Calling groups_inventory to load vars for managed_node1 28011 1726882532.86031: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882532.86040: Calling all_plugins_play to load vars for managed_node1 28011 1726882532.86042: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882532.86045: Calling groups_plugins_play to load vars for managed_node1 28011 1726882532.86151: done sending task result for task 12673a56-9f93-962d-7c65-00000000010b 28011 1726882532.86155: WORKER PROCESS EXITING 28011 1726882532.86160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882532.86289: done with get_vars() 28011 1726882532.86299: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:35:32 -0400 (0:00:00.014) 0:00:02.414 ****** 28011 1726882532.86353: entering _queue_task() for managed_node1/include_tasks 28011 1726882532.86524: worker is 1 (out of 1 available) 28011 1726882532.86536: exiting _queue_task() for managed_node1/include_tasks 28011 1726882532.86546: done queuing things up, now waiting for results queue to drain 28011 1726882532.86547: waiting for pending results... 
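The "Fix CentOS6 Base repo" task above is skipped with `"false_condition": "ansible_distribution_major_version == '6'"` after `ansible_distribution == 'CentOS'` evaluated `True`. That is the standard multi-condition `when` pattern, where conditions are ANDed and the first false one is reported as the skip reason. A sketch, assuming only what the log shows (the task's actual `copy` body is not visible in this log):

```yaml
# Hedged sketch -- the task name, action plugin (copy), and both conditions
# come from the log above; the copy body itself is not shown and is elided.
- name: Fix CentOS6 Base repo
  copy:
    # destination and content not visible in the log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

Since the managed node is CentOS but not major version 6, Ansible reports `skip_reason: "Conditional result was False"` and moves on to the next task in the block, as the log shows.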
28011 1726882532.86676: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml'
28011 1726882532.86735: in run() - task 12673a56-9f93-962d-7c65-00000000010c
28011 1726882532.86745: variable 'ansible_search_path' from source: unknown
28011 1726882532.86748: variable 'ansible_search_path' from source: unknown
28011 1726882532.86774: calling self._execute()
28011 1726882532.86829: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882532.86833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882532.86841: variable 'omit' from source: magic vars
28011 1726882532.87163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28011 1726882532.88723: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28011 1726882532.88769: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28011 1726882532.88795: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
28011 1726882532.88830: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
28011 1726882532.88850: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
28011 1726882532.88909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28011 1726882532.88929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28011 1726882532.88946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28011 1726882532.88973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28011 1726882532.88985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28011 1726882532.89062: variable '__network_is_ostree' from source: set_fact
28011 1726882532.89075: Evaluated conditional (not __network_is_ostree | d(false)): True
28011 1726882532.89079: _execute() done
28011 1726882532.89082: dumping result to json
28011 1726882532.89084: done dumping result, returning
28011 1726882532.89095: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [12673a56-9f93-962d-7c65-00000000010c]
28011 1726882532.89098: sending task result for task 12673a56-9f93-962d-7c65-00000000010c
28011 1726882532.89177: done sending task result for task 12673a56-9f93-962d-7c65-00000000010c
28011 1726882532.89180: WORKER PROCESS EXITING
28011 1726882532.89209: no more pending results, returning what we have
28011 1726882532.89214: in VariableManager get_vars()
28011 1726882532.89245: Calling all_inventory to load vars for managed_node1
28011 1726882532.89248: Calling groups_inventory to load vars for managed_node1
28011 1726882532.89251: Calling all_plugins_inventory to load vars for managed_node1
28011 1726882532.89261: Calling all_plugins_play to load vars for managed_node1
28011 1726882532.89263: Calling groups_plugins_inventory to load vars for managed_node1
28011 1726882532.89266: Calling groups_plugins_play to load vars for managed_node1
28011 1726882532.89419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882532.89529: done with get_vars()
28011 1726882532.89535: variable 'ansible_search_path' from source: unknown
28011 1726882532.89536: variable 'ansible_search_path' from source: unknown
28011 1726882532.89558: we have included files to process
28011 1726882532.89559: generating all_blocks data
28011 1726882532.89560: done generating all_blocks data
28011 1726882532.89564: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
28011 1726882532.89565: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
28011 1726882532.89566: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
28011 1726882532.90016: done processing included file
28011 1726882532.90018: iterating over new_blocks loaded from include file
28011 1726882532.90019: in VariableManager get_vars()
28011 1726882532.90027: done with get_vars()
28011 1726882532.90028: filtering new block on tags
28011 1726882532.90043: done filtering new block on tags
28011 1726882532.90045: in VariableManager get_vars()
28011 1726882532.90052: done with get_vars()
28011 1726882532.90053: filtering new block on tags
28011 1726882532.90059: done filtering new block on tags
28011 1726882532.90060: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1
28011 1726882532.90064: extending task lists for all hosts with included blocks
28011 1726882532.90122: done extending task lists
28011 1726882532.90123: done processing included files
28011 1726882532.90124: results queue empty
28011 1726882532.90124: checking for any_errors_fatal
28011 1726882532.90127: done checking for any_errors_fatal
28011 1726882532.90127: checking for max_fail_percentage
28011 1726882532.90128: done checking for max_fail_percentage
28011 1726882532.90128: checking to see if all hosts have failed and the running result is not ok
28011 1726882532.90129: done checking to see if all hosts have failed
28011 1726882532.90129: getting the remaining hosts for this loop
28011 1726882532.90130: done getting the remaining hosts for this loop
28011 1726882532.90131: getting the next task for host managed_node1
28011 1726882532.90133: done getting next task for host managed_node1
28011 1726882532.90135: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }}
28011 1726882532.90136: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28011 1726882532.90139: getting variables
28011 1726882532.90140: in VariableManager get_vars()
28011 1726882532.90146: Calling all_inventory to load vars for managed_node1
28011 1726882532.90147: Calling groups_inventory to load vars for managed_node1
28011 1726882532.90149: Calling all_plugins_inventory to load vars for managed_node1
28011 1726882532.90152: Calling all_plugins_play to load vars for managed_node1
28011 1726882532.90157: Calling groups_plugins_inventory to load vars for managed_node1
28011 1726882532.90159: Calling groups_plugins_play to load vars for managed_node1
28011 1726882532.90348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882532.90456: done with get_vars()
28011 1726882532.90462: done getting variables
28011 1726882532.90511: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
28011 1726882532.90644: variable 'ansible_distribution_major_version' from source: facts

TASK [Create EPEL 10] **********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
Friday 20 September 2024 21:35:32 -0400 (0:00:00.043) 0:00:02.458 ******
28011 1726882532.90674: entering _queue_task() for managed_node1/command
28011 1726882532.90675: Creating lock for command
28011 1726882532.90883: worker is 1 (out of 1 available)
28011 1726882532.90900: exiting _queue_task() for managed_node1/command
28011 1726882532.90910: done queuing things up, now waiting for results queue to drain
28011 1726882532.90912: waiting for pending results...
28011 1726882532.91049: running TaskExecutor() for managed_node1/TASK: Create EPEL 10
28011 1726882532.91125: in run() - task 12673a56-9f93-962d-7c65-000000000126
28011 1726882532.91137: variable 'ansible_search_path' from source: unknown
28011 1726882532.91146: variable 'ansible_search_path' from source: unknown
28011 1726882532.91172: calling self._execute()
28011 1726882532.91228: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882532.91232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882532.91240: variable 'omit' from source: magic vars
28011 1726882532.91502: variable 'ansible_distribution' from source: facts
28011 1726882532.91511: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
28011 1726882532.91597: variable 'ansible_distribution_major_version' from source: facts
28011 1726882532.91600: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
28011 1726882532.91603: when evaluation is False, skipping this task
28011 1726882532.91605: _execute() done
28011 1726882532.91608: dumping result to json
28011 1726882532.91610: done dumping result, returning
28011 1726882532.91617: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [12673a56-9f93-962d-7c65-000000000126]
28011 1726882532.91623: sending task result for task 12673a56-9f93-962d-7c65-000000000126
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
28011 1726882532.91764: no more pending results, returning what we have
28011 1726882532.91767: results queue empty
28011 1726882532.91768: checking for any_errors_fatal
28011 1726882532.91769: done checking for any_errors_fatal
28011 1726882532.91769: checking for max_fail_percentage
28011 1726882532.91771: done checking for max_fail_percentage
28011 1726882532.91771: checking to see if all hosts have failed and the running result is not ok
28011 1726882532.91772: done checking to see if all hosts have failed
28011 1726882532.91773: getting the remaining hosts for this loop
28011 1726882532.91774: done getting the remaining hosts for this loop
28011 1726882532.91777: getting the next task for host managed_node1
28011 1726882532.91782: done getting next task for host managed_node1
28011 1726882532.91784: ^ task is: TASK: Install yum-utils package
28011 1726882532.91788: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28011 1726882532.91795: getting variables
28011 1726882532.91797: in VariableManager get_vars()
28011 1726882532.91819: Calling all_inventory to load vars for managed_node1
28011 1726882532.91821: Calling groups_inventory to load vars for managed_node1
28011 1726882532.91824: Calling all_plugins_inventory to load vars for managed_node1
28011 1726882532.91832: Calling all_plugins_play to load vars for managed_node1
28011 1726882532.91834: Calling groups_plugins_inventory to load vars for managed_node1
28011 1726882532.91837: Calling groups_plugins_play to load vars for managed_node1
28011 1726882532.91946: done sending task result for task 12673a56-9f93-962d-7c65-000000000126
28011 1726882532.91949: WORKER PROCESS EXITING
28011 1726882532.91958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882532.92073: done with get_vars()
28011 1726882532.92080: done getting variables
28011 1726882532.92147: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Friday 20 September 2024 21:35:32 -0400 (0:00:00.014) 0:00:02.472 ******
28011 1726882532.92165: entering _queue_task() for managed_node1/package
28011 1726882532.92166: Creating lock for package
28011 1726882532.92353: worker is 1 (out of 1 available)
28011 1726882532.92366: exiting _queue_task() for managed_node1/package
28011 1726882532.92376: done queuing things up, now waiting for results queue to drain
28011 1726882532.92377: waiting for pending results...
28011 1726882532.92516: running TaskExecutor() for managed_node1/TASK: Install yum-utils package
28011 1726882532.92576: in run() - task 12673a56-9f93-962d-7c65-000000000127
28011 1726882532.92587: variable 'ansible_search_path' from source: unknown
28011 1726882532.92594: variable 'ansible_search_path' from source: unknown
28011 1726882532.92619: calling self._execute()
28011 1726882532.92729: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882532.92734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882532.92745: variable 'omit' from source: magic vars
28011 1726882532.92980: variable 'ansible_distribution' from source: facts
28011 1726882532.92989: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
28011 1726882532.93072: variable 'ansible_distribution_major_version' from source: facts
28011 1726882532.93076: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
28011 1726882532.93078: when evaluation is False, skipping this task
28011 1726882532.93081: _execute() done
28011 1726882532.93084: dumping result to json
28011 1726882532.93086: done dumping result, returning
28011 1726882532.93097: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [12673a56-9f93-962d-7c65-000000000127]
28011 1726882532.93100: sending task result for task 12673a56-9f93-962d-7c65-000000000127
28011 1726882532.93180: done sending task result for task 12673a56-9f93-962d-7c65-000000000127
28011 1726882532.93183: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
28011 1726882532.93280: no more pending results, returning what we have
28011 1726882532.93282: results queue empty
28011 1726882532.93283: checking for any_errors_fatal
28011 1726882532.93285: done checking for any_errors_fatal
28011 1726882532.93286: checking for max_fail_percentage
28011 1726882532.93288: done checking for max_fail_percentage
28011 1726882532.93288: checking to see if all hosts have failed and the running result is not ok
28011 1726882532.93289: done checking to see if all hosts have failed
28011 1726882532.93292: getting the remaining hosts for this loop
28011 1726882532.93300: done getting the remaining hosts for this loop
28011 1726882532.93303: getting the next task for host managed_node1
28011 1726882532.93307: done getting next task for host managed_node1
28011 1726882532.93308: ^ task is: TASK: Enable EPEL 7
28011 1726882532.93310: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28011 1726882532.93313: getting variables
28011 1726882532.93314: in VariableManager get_vars()
28011 1726882532.93327: Calling all_inventory to load vars for managed_node1
28011 1726882532.93328: Calling groups_inventory to load vars for managed_node1
28011 1726882532.93330: Calling all_plugins_inventory to load vars for managed_node1
28011 1726882532.93336: Calling all_plugins_play to load vars for managed_node1
28011 1726882532.93338: Calling groups_plugins_inventory to load vars for managed_node1
28011 1726882532.93339: Calling groups_plugins_play to load vars for managed_node1
28011 1726882532.93436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882532.93547: done with get_vars()
28011 1726882532.93553: done getting variables
28011 1726882532.93587: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Friday 20 September 2024 21:35:32 -0400 (0:00:00.014) 0:00:02.487 ******
28011 1726882532.93609: entering _queue_task() for managed_node1/command
28011 1726882532.93780: worker is 1 (out of 1 available)
28011 1726882532.93797: exiting _queue_task() for managed_node1/command
28011 1726882532.93808: done queuing things up, now waiting for results queue to drain
28011 1726882532.93810: waiting for pending results...
28011 1726882532.93938: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7
28011 1726882532.94004: in run() - task 12673a56-9f93-962d-7c65-000000000128
28011 1726882532.94014: variable 'ansible_search_path' from source: unknown
28011 1726882532.94018: variable 'ansible_search_path' from source: unknown
28011 1726882532.94042: calling self._execute()
28011 1726882532.94096: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882532.94101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882532.94108: variable 'omit' from source: magic vars
28011 1726882532.94354: variable 'ansible_distribution' from source: facts
28011 1726882532.94363: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
28011 1726882532.94450: variable 'ansible_distribution_major_version' from source: facts
28011 1726882532.94453: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
28011 1726882532.94456: when evaluation is False, skipping this task
28011 1726882532.94459: _execute() done
28011 1726882532.94461: dumping result to json
28011 1726882532.94464: done dumping result, returning
28011 1726882532.94471: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [12673a56-9f93-962d-7c65-000000000128]
28011 1726882532.94475: sending task result for task 12673a56-9f93-962d-7c65-000000000128
28011 1726882532.94552: done sending task result for task 12673a56-9f93-962d-7c65-000000000128
28011 1726882532.94555: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
28011 1726882532.94626: no more pending results, returning what we have
28011 1726882532.94628: results queue empty
28011 1726882532.94629: checking for any_errors_fatal
28011 1726882532.94632: done checking for any_errors_fatal
28011 1726882532.94633: checking for max_fail_percentage
28011 1726882532.94634: done checking for max_fail_percentage
28011 1726882532.94635: checking to see if all hosts have failed and the running result is not ok
28011 1726882532.94636: done checking to see if all hosts have failed
28011 1726882532.94636: getting the remaining hosts for this loop
28011 1726882532.94637: done getting the remaining hosts for this loop
28011 1726882532.94640: getting the next task for host managed_node1
28011 1726882532.94644: done getting next task for host managed_node1
28011 1726882532.94646: ^ task is: TASK: Enable EPEL 8
28011 1726882532.94649: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28011 1726882532.94652: getting variables
28011 1726882532.94653: in VariableManager get_vars()
28011 1726882532.94669: Calling all_inventory to load vars for managed_node1
28011 1726882532.94671: Calling groups_inventory to load vars for managed_node1
28011 1726882532.94672: Calling all_plugins_inventory to load vars for managed_node1
28011 1726882532.94678: Calling all_plugins_play to load vars for managed_node1
28011 1726882532.94680: Calling groups_plugins_inventory to load vars for managed_node1
28011 1726882532.94682: Calling groups_plugins_play to load vars for managed_node1
28011 1726882532.94782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882532.94915: done with get_vars()
28011 1726882532.94921: done getting variables
28011 1726882532.94956: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Friday 20 September 2024 21:35:32 -0400 (0:00:00.013) 0:00:02.501 ******
28011 1726882532.94974: entering _queue_task() for managed_node1/command
28011 1726882532.95138: worker is 1 (out of 1 available)
28011 1726882532.95149: exiting _queue_task() for managed_node1/command
28011 1726882532.95160: done queuing things up, now waiting for results queue to drain
28011 1726882532.95162: waiting for pending results...
28011 1726882532.95289: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8
28011 1726882532.95352: in run() - task 12673a56-9f93-962d-7c65-000000000129
28011 1726882532.95362: variable 'ansible_search_path' from source: unknown
28011 1726882532.95364: variable 'ansible_search_path' from source: unknown
28011 1726882532.95392: calling self._execute()
28011 1726882532.95444: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882532.95448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882532.95456: variable 'omit' from source: magic vars
28011 1726882532.95697: variable 'ansible_distribution' from source: facts
28011 1726882532.95705: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
28011 1726882532.95788: variable 'ansible_distribution_major_version' from source: facts
28011 1726882532.95795: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
28011 1726882532.95798: when evaluation is False, skipping this task
28011 1726882532.95801: _execute() done
28011 1726882532.95803: dumping result to json
28011 1726882532.95806: done dumping result, returning
28011 1726882532.95811: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [12673a56-9f93-962d-7c65-000000000129]
28011 1726882532.95815: sending task result for task 12673a56-9f93-962d-7c65-000000000129
28011 1726882532.95896: done sending task result for task 12673a56-9f93-962d-7c65-000000000129
28011 1726882532.95899: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
28011 1726882532.95939: no more pending results, returning what we have
28011 1726882532.95942: results queue empty
28011 1726882532.95942: checking for any_errors_fatal
28011 1726882532.95947: done checking for any_errors_fatal
28011 1726882532.95947: checking for max_fail_percentage
28011 1726882532.95949: done checking for max_fail_percentage
28011 1726882532.95950: checking to see if all hosts have failed and the running result is not ok
28011 1726882532.95950: done checking to see if all hosts have failed
28011 1726882532.95951: getting the remaining hosts for this loop
28011 1726882532.95952: done getting the remaining hosts for this loop
28011 1726882532.95955: getting the next task for host managed_node1
28011 1726882532.95961: done getting next task for host managed_node1
28011 1726882532.95963: ^ task is: TASK: Enable EPEL 6
28011 1726882532.95967: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28011 1726882532.95969: getting variables
28011 1726882532.95971: in VariableManager get_vars()
28011 1726882532.96002: Calling all_inventory to load vars for managed_node1
28011 1726882532.96004: Calling groups_inventory to load vars for managed_node1
28011 1726882532.96006: Calling all_plugins_inventory to load vars for managed_node1
28011 1726882532.96012: Calling all_plugins_play to load vars for managed_node1
28011 1726882532.96014: Calling groups_plugins_inventory to load vars for managed_node1
28011 1726882532.96015: Calling groups_plugins_play to load vars for managed_node1
28011 1726882532.96120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882532.96233: done with get_vars()
28011 1726882532.96239: done getting variables
28011 1726882532.96273: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Friday 20 September 2024 21:35:32 -0400 (0:00:00.013) 0:00:02.514 ******
28011 1726882532.96292: entering _queue_task() for managed_node1/copy
28011 1726882532.96452: worker is 1 (out of 1 available)
28011 1726882532.96463: exiting _queue_task() for managed_node1/copy
28011 1726882532.96474: done queuing things up, now waiting for results queue to drain
28011 1726882532.96475: waiting for pending results...
28011 1726882532.96605: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6
28011 1726882532.96663: in run() - task 12673a56-9f93-962d-7c65-00000000012b
28011 1726882532.96672: variable 'ansible_search_path' from source: unknown
28011 1726882532.96675: variable 'ansible_search_path' from source: unknown
28011 1726882532.96702: calling self._execute()
28011 1726882532.96752: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882532.96756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882532.96764: variable 'omit' from source: magic vars
28011 1726882532.97035: variable 'ansible_distribution' from source: facts
28011 1726882532.97039: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
28011 1726882532.97112: variable 'ansible_distribution_major_version' from source: facts
28011 1726882532.97116: Evaluated conditional (ansible_distribution_major_version == '6'): False
28011 1726882532.97119: when evaluation is False, skipping this task
28011 1726882532.97121: _execute() done
28011 1726882532.97124: dumping result to json
28011 1726882532.97128: done dumping result, returning
28011 1726882532.97134: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [12673a56-9f93-962d-7c65-00000000012b]
28011 1726882532.97139: sending task result for task 12673a56-9f93-962d-7c65-00000000012b
28011 1726882532.97219: done sending task result for task 12673a56-9f93-962d-7c65-00000000012b
28011 1726882532.97222: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
28011 1726882532.97292: no more pending results, returning what we have
28011 1726882532.97297: results queue empty
28011 1726882532.97297: checking for any_errors_fatal
28011 1726882532.97301: done checking for any_errors_fatal
28011 1726882532.97301: checking for max_fail_percentage
28011 1726882532.97303: done checking for max_fail_percentage
28011 1726882532.97303: checking to see if all hosts have failed and the running result is not ok
28011 1726882532.97304: done checking to see if all hosts have failed
28011 1726882532.97305: getting the remaining hosts for this loop
28011 1726882532.97306: done getting the remaining hosts for this loop
28011 1726882532.97309: getting the next task for host managed_node1
28011 1726882532.97314: done getting next task for host managed_node1
28011 1726882532.97316: ^ task is: TASK: Set network provider to 'nm'
28011 1726882532.97319: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28011 1726882532.97322: getting variables
28011 1726882532.97323: in VariableManager get_vars()
28011 1726882532.97341: Calling all_inventory to load vars for managed_node1
28011 1726882532.97342: Calling groups_inventory to load vars for managed_node1
28011 1726882532.97344: Calling all_plugins_inventory to load vars for managed_node1
28011 1726882532.97350: Calling all_plugins_play to load vars for managed_node1
28011 1726882532.97352: Calling groups_plugins_inventory to load vars for managed_node1
28011 1726882532.97354: Calling groups_plugins_play to load vars for managed_node1
28011 1726882532.97476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882532.97583: done with get_vars()
28011 1726882532.97588: done getting variables
28011 1726882532.97626: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set network provider to 'nm'] ********************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:13
Friday 20 September 2024 21:35:32 -0400 (0:00:00.013) 0:00:02.527 ******
28011 1726882532.97642: entering _queue_task() for managed_node1/set_fact
28011 1726882532.97796: worker is 1 (out of 1 available)
28011 1726882532.97806: exiting _queue_task() for managed_node1/set_fact
28011 1726882532.97815: done queuing things up, now waiting for results queue to drain
28011 1726882532.97817: waiting for pending results...
28011 1726882532.97947: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm'
28011 1726882532.97988: in run() - task 12673a56-9f93-962d-7c65-000000000007
28011 1726882532.98002: variable 'ansible_search_path' from source: unknown
28011 1726882532.98028: calling self._execute()
28011 1726882532.98081: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882532.98085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882532.98097: variable 'omit' from source: magic vars
28011 1726882532.98164: variable 'omit' from source: magic vars
28011 1726882532.98185: variable 'omit' from source: magic vars
28011 1726882532.98212: variable 'omit' from source: magic vars
28011 1726882532.98291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
28011 1726882532.98334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
28011 1726882532.98338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
28011 1726882532.98348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28011 1726882532.98360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28011 1726882532.98399: variable 'inventory_hostname' from source: host vars for 'managed_node1'
28011 1726882532.98403: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882532.98406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882532.98476: Set connection var ansible_connection to ssh
28011 1726882532.98483: Set connection var ansible_pipelining to False
28011 1726882532.98488: Set connection var ansible_module_compression to ZIP_DEFLATED
28011 1726882532.98496: Set connection var ansible_shell_executable to /bin/sh
28011 1726882532.98504: Set connection var ansible_timeout to 10
28011 1726882532.98508: Set connection var ansible_shell_type to sh
28011 1726882532.98525: variable 'ansible_shell_executable' from source: unknown
28011 1726882532.98528: variable 'ansible_connection' from source: unknown
28011 1726882532.98530: variable 'ansible_module_compression' from source: unknown
28011 1726882532.98540: variable 'ansible_shell_type' from source: unknown
28011 1726882532.98545: variable 'ansible_shell_executable' from source: unknown
28011 1726882532.98548: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882532.98550: variable 'ansible_pipelining' from source: unknown
28011 1726882532.98552: variable 'ansible_timeout' from source: unknown
28011 1726882532.98554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882532.98651: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
28011 1726882532.98657: variable 'omit' from source: magic vars
28011 1726882532.98660: starting
attempt loop 28011 1726882532.98662: running the handler 28011 1726882532.98669: handler run complete 28011 1726882532.98676: attempt loop complete, returning result 28011 1726882532.98679: _execute() done 28011 1726882532.98681: dumping result to json 28011 1726882532.98685: done dumping result, returning 28011 1726882532.98690: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [12673a56-9f93-962d-7c65-000000000007] 28011 1726882532.98699: sending task result for task 12673a56-9f93-962d-7c65-000000000007 ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 28011 1726882532.98823: no more pending results, returning what we have 28011 1726882532.98825: results queue empty 28011 1726882532.98826: checking for any_errors_fatal 28011 1726882532.98830: done checking for any_errors_fatal 28011 1726882532.98831: checking for max_fail_percentage 28011 1726882532.98832: done checking for max_fail_percentage 28011 1726882532.98833: checking to see if all hosts have failed and the running result is not ok 28011 1726882532.98834: done checking to see if all hosts have failed 28011 1726882532.98834: getting the remaining hosts for this loop 28011 1726882532.98835: done getting the remaining hosts for this loop 28011 1726882532.98838: getting the next task for host managed_node1 28011 1726882532.98842: done getting next task for host managed_node1 28011 1726882532.98844: ^ task is: TASK: meta (flush_handlers) 28011 1726882532.98846: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882532.98849: getting variables 28011 1726882532.98850: in VariableManager get_vars() 28011 1726882532.98873: Calling all_inventory to load vars for managed_node1 28011 1726882532.98875: Calling groups_inventory to load vars for managed_node1 28011 1726882532.98878: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882532.98885: Calling all_plugins_play to load vars for managed_node1 28011 1726882532.98888: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882532.98891: Calling groups_plugins_play to load vars for managed_node1 28011 1726882532.98988: done sending task result for task 12673a56-9f93-962d-7c65-000000000007 28011 1726882532.98995: WORKER PROCESS EXITING 28011 1726882532.99007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882532.99116: done with get_vars() 28011 1726882532.99122: done getting variables 28011 1726882532.99161: in VariableManager get_vars() 28011 1726882532.99166: Calling all_inventory to load vars for managed_node1 28011 1726882532.99168: Calling groups_inventory to load vars for managed_node1 28011 1726882532.99169: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882532.99172: Calling all_plugins_play to load vars for managed_node1 28011 1726882532.99173: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882532.99174: Calling groups_plugins_play to load vars for managed_node1 28011 1726882532.99255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882532.99378: done with get_vars() 28011 1726882532.99387: done queuing things up, now waiting for results queue to drain 28011 1726882532.99388: results queue empty 28011 1726882532.99388: checking for any_errors_fatal 28011 1726882532.99390: done checking for any_errors_fatal 28011 1726882532.99390: checking for max_fail_percentage 28011 
1726882532.99391: done checking for max_fail_percentage 28011 1726882532.99392: checking to see if all hosts have failed and the running result is not ok 28011 1726882532.99392: done checking to see if all hosts have failed 28011 1726882532.99394: getting the remaining hosts for this loop 28011 1726882532.99395: done getting the remaining hosts for this loop 28011 1726882532.99396: getting the next task for host managed_node1 28011 1726882532.99399: done getting next task for host managed_node1 28011 1726882532.99399: ^ task is: TASK: meta (flush_handlers) 28011 1726882532.99400: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882532.99405: getting variables 28011 1726882532.99406: in VariableManager get_vars() 28011 1726882532.99411: Calling all_inventory to load vars for managed_node1 28011 1726882532.99412: Calling groups_inventory to load vars for managed_node1 28011 1726882532.99413: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882532.99416: Calling all_plugins_play to load vars for managed_node1 28011 1726882532.99417: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882532.99419: Calling groups_plugins_play to load vars for managed_node1 28011 1726882532.99497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882532.99601: done with get_vars() 28011 1726882532.99607: done getting variables 28011 1726882532.99634: in VariableManager get_vars() 28011 1726882532.99639: Calling all_inventory to load vars for managed_node1 28011 1726882532.99641: Calling groups_inventory to load vars for managed_node1 28011 1726882532.99642: Calling all_plugins_inventory to load vars for managed_node1 28011 
1726882532.99645: Calling all_plugins_play to load vars for managed_node1 28011 1726882532.99647: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882532.99649: Calling groups_plugins_play to load vars for managed_node1 28011 1726882532.99728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882532.99845: done with get_vars() 28011 1726882532.99852: done queuing things up, now waiting for results queue to drain 28011 1726882532.99853: results queue empty 28011 1726882532.99853: checking for any_errors_fatal 28011 1726882532.99854: done checking for any_errors_fatal 28011 1726882532.99854: checking for max_fail_percentage 28011 1726882532.99855: done checking for max_fail_percentage 28011 1726882532.99855: checking to see if all hosts have failed and the running result is not ok 28011 1726882532.99856: done checking to see if all hosts have failed 28011 1726882532.99856: getting the remaining hosts for this loop 28011 1726882532.99857: done getting the remaining hosts for this loop 28011 1726882532.99858: getting the next task for host managed_node1 28011 1726882532.99860: done getting next task for host managed_node1 28011 1726882532.99860: ^ task is: None 28011 1726882532.99861: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882532.99862: done queuing things up, now waiting for results queue to drain 28011 1726882532.99862: results queue empty 28011 1726882532.99863: checking for any_errors_fatal 28011 1726882532.99863: done checking for any_errors_fatal 28011 1726882532.99864: checking for max_fail_percentage 28011 1726882532.99865: done checking for max_fail_percentage 28011 1726882532.99865: checking to see if all hosts have failed and the running result is not ok 28011 1726882532.99866: done checking to see if all hosts have failed 28011 1726882532.99868: getting the next task for host managed_node1 28011 1726882532.99870: done getting next task for host managed_node1 28011 1726882532.99870: ^ task is: None 28011 1726882532.99871: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882532.99920: in VariableManager get_vars() 28011 1726882532.99942: done with get_vars() 28011 1726882532.99947: in VariableManager get_vars() 28011 1726882532.99960: done with get_vars() 28011 1726882532.99964: variable 'omit' from source: magic vars 28011 1726882532.99997: in VariableManager get_vars() 28011 1726882533.00013: done with get_vars() 28011 1726882533.00035: variable 'omit' from source: magic vars PLAY [Play for testing route table] ******************************************** 28011 1726882533.00400: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28011 1726882533.00431: getting the remaining hosts for this loop 28011 1726882533.00432: done getting the remaining hosts for this loop 28011 1726882533.00435: getting the next task for host managed_node1 28011 1726882533.00437: done getting next task for host managed_node1 28011 1726882533.00439: ^ task is: TASK: Gathering Facts 28011 1726882533.00440: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882533.00442: getting variables 28011 1726882533.00443: in VariableManager get_vars() 28011 1726882533.00454: Calling all_inventory to load vars for managed_node1 28011 1726882533.00456: Calling groups_inventory to load vars for managed_node1 28011 1726882533.00458: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882533.00462: Calling all_plugins_play to load vars for managed_node1 28011 1726882533.00475: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882533.00478: Calling groups_plugins_play to load vars for managed_node1 28011 1726882533.00626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882533.00824: done with get_vars() 28011 1726882533.00833: done getting variables 28011 1726882533.00876: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:3 Friday 20 September 2024 21:35:33 -0400 (0:00:00.032) 0:00:02.560 ****** 28011 1726882533.00903: entering _queue_task() for managed_node1/gather_facts 28011 1726882533.01147: worker is 1 (out of 1 available) 28011 1726882533.01157: exiting _queue_task() for managed_node1/gather_facts 28011 1726882533.01167: done queuing things up, now waiting for results queue to drain 28011 1726882533.01168: waiting for pending results... 
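Before the gather_facts module is transferred, the log below shows Ansible's standard first low-level step: probing the remote user's home directory with `/bin/sh -c 'echo ~ && sleep 0'`. A minimal local sketch of that probe (run in a plain shell here rather than over the multiplexed SSH connection the log uses):

```shell
# Sketch of the home-directory probe from _low_level_execute_command()
# as recorded in this run. Running it locally is an assumption for
# illustration; Ansible issues it on the managed node over ssh.
home_dir=$(/bin/sh -c 'echo ~ && sleep 0')
echo "resolved remote home: ${home_dir}"
```

The trailing `sleep 0` matches the log's command line; it forces the shell to flush and exit cleanly so the controller gets a deterministic rc=0 with the home path on stdout.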
28011 1726882533.01400: running TaskExecutor() for managed_node1/TASK: Gathering Facts 28011 1726882533.01520: in run() - task 12673a56-9f93-962d-7c65-000000000151 28011 1726882533.01523: variable 'ansible_search_path' from source: unknown 28011 1726882533.01560: calling self._execute() 28011 1726882533.01716: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882533.01720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882533.01722: variable 'omit' from source: magic vars 28011 1726882533.02032: variable 'ansible_distribution_major_version' from source: facts 28011 1726882533.02055: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882533.02155: variable 'omit' from source: magic vars 28011 1726882533.02159: variable 'omit' from source: magic vars 28011 1726882533.02161: variable 'omit' from source: magic vars 28011 1726882533.02189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882533.02226: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882533.02249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882533.02277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882533.02302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882533.02372: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882533.02375: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882533.02378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882533.02462: Set connection var ansible_connection to ssh 28011 1726882533.02480: Set 
connection var ansible_pipelining to False 28011 1726882533.02503: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882533.02515: Set connection var ansible_shell_executable to /bin/sh 28011 1726882533.02528: Set connection var ansible_timeout to 10 28011 1726882533.02542: Set connection var ansible_shell_type to sh 28011 1726882533.02558: variable 'ansible_shell_executable' from source: unknown 28011 1726882533.02561: variable 'ansible_connection' from source: unknown 28011 1726882533.02563: variable 'ansible_module_compression' from source: unknown 28011 1726882533.02607: variable 'ansible_shell_type' from source: unknown 28011 1726882533.02611: variable 'ansible_shell_executable' from source: unknown 28011 1726882533.02614: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882533.02617: variable 'ansible_pipelining' from source: unknown 28011 1726882533.02620: variable 'ansible_timeout' from source: unknown 28011 1726882533.02622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882533.02802: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882533.02812: variable 'omit' from source: magic vars 28011 1726882533.02816: starting attempt loop 28011 1726882533.02819: running the handler 28011 1726882533.02834: variable 'ansible_facts' from source: unknown 28011 1726882533.02851: _low_level_execute_command(): starting 28011 1726882533.02858: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882533.03336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28011 1726882533.03340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882533.03342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882533.03344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882533.03397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882533.03403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882533.03449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882533.05067: stdout chunk (state=3): >>>/root <<< 28011 1726882533.05157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882533.05160: stdout chunk (state=3): >>><<< 28011 1726882533.05197: stderr chunk (state=3): >>><<< 28011 1726882533.05215: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882533.05321: _low_level_execute_command(): starting 28011 1726882533.05325: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476 `" && echo ansible-tmp-1726882533.0523791-28158-60111999048476="` echo /root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476 `" ) && sleep 0' 28011 1726882533.06637: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882533.06643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882533.06646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28011 1726882533.06656: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882533.06658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882533.06787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882533.06811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882533.06822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882533.06999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882533.08826: stdout chunk (state=3): >>>ansible-tmp-1726882533.0523791-28158-60111999048476=/root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476 <<< 28011 1726882533.08999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882533.09002: stdout chunk (state=3): >>><<< 28011 1726882533.09005: stderr chunk (state=3): >>><<< 28011 1726882533.09022: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882533.0523791-28158-60111999048476=/root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882533.09398: variable 'ansible_module_compression' from source: unknown 28011 1726882533.09401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28011 1726882533.09404: variable 'ansible_facts' from source: unknown 28011 1726882533.09730: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476/AnsiballZ_setup.py 28011 1726882533.10323: Sending initial data 28011 1726882533.10326: Sent initial data (153 bytes) 28011 1726882533.11251: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882533.11420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882533.11423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882533.11425: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882533.11427: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration <<< 28011 1726882533.11429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882533.11431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882533.11604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882533.11631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882533.13144: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882533.13204: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882533.13263: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpr055lxx2 /root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476/AnsiballZ_setup.py <<< 28011 1726882533.13283: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476/AnsiballZ_setup.py" <<< 28011 1726882533.13348: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpr055lxx2" to remote "/root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476/AnsiballZ_setup.py" <<< 28011 1726882533.15671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882533.15746: stderr chunk (state=3): >>><<< 28011 1726882533.15771: stdout chunk (state=3): >>><<< 28011 1726882533.15799: done transferring module to remote 28011 1726882533.15815: _low_level_execute_command(): starting 28011 1726882533.15833: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476/ /root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476/AnsiballZ_setup.py && sleep 0' 28011 1726882533.16601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882533.16697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882533.16734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882533.16752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882533.16779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882533.16862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882533.18570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882533.18599: stderr chunk (state=3): >>><<< 28011 1726882533.18602: stdout chunk (state=3): >>><<< 28011 1726882533.18610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882533.18613: _low_level_execute_command(): starting 28011 1726882533.18618: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476/AnsiballZ_setup.py && sleep 0' 28011 1726882533.19037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882533.19040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882533.19042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28011 1726882533.19044: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882533.19126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882533.19168: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28011 1726882533.82376: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.52392578125, "5m": 0.3984375, "15m": 0.2158203125}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, 
"ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2953, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 578, "free": 2953}, <<< 28011 1726882533.82427: stdout chunk (state=3): >>>"nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, 
"uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 966, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794066432, "block_size": 4096, "block_total": 65519099, "block_available": 63914567, "block_used": 1604532, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "33", "epoch": "1726882533", "epoch_int": "1726882533", "date": "2024-09-20", "time": "21:35:33", "iso8601_micro": "2024-09-21T01:35:33.781051Z", "iso8601": "2024-09-21T01:35:33Z", "iso8601_basic": "20240920T213533781051", "iso8601_basic_short": "20240920T213533", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": 
"0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": 
["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28011 1726882533.84288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882533.84297: stdout chunk (state=3): >>><<< 28011 1726882533.84299: stderr chunk (state=3): >>><<< 28011 1726882533.84502: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.52392578125, "5m": 0.3984375, "15m": 0.2158203125}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": 
"||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2953, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 578, "free": 2953}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 966, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794066432, "block_size": 4096, "block_total": 65519099, "block_available": 63914567, "block_used": 1604532, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "33", "epoch": "1726882533", "epoch_int": "1726882533", "date": "2024-09-20", "time": "21:35:33", "iso8601_micro": "2024-09-21T01:35:33.781051Z", "iso8601": "2024-09-21T01:35:33Z", "iso8601_basic": "20240920T213533781051", "iso8601_basic_short": "20240920T213533", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882533.84881: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882533.84935: _low_level_execute_command(): starting 28011 1726882533.84945: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882533.0523791-28158-60111999048476/ > /dev/null 2>&1 && sleep 0' 28011 1726882533.86221: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882533.86376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882533.86465: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882533.86543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882533.88399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882533.88443: stdout chunk (state=3): >>><<< 28011 1726882533.88902: stderr chunk (state=3): >>><<< 28011 1726882533.88905: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882533.88907: handler run complete 28011 1726882533.88909: variable 'ansible_facts' from source: unknown 28011 1726882533.88937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882533.89537: variable 'ansible_facts' from source: unknown 28011 1726882533.89899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882533.90074: attempt loop complete, returning result 28011 1726882533.90138: _execute() done 28011 1726882533.90146: dumping result to json 28011 1726882533.90179: done dumping result, returning 28011 1726882533.90214: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-962d-7c65-000000000151] 28011 1726882533.90228: sending task result for task 12673a56-9f93-962d-7c65-000000000151 ok: [managed_node1] 28011 1726882533.91179: no more pending results, returning what we have 28011 1726882533.91182: results queue empty 28011 1726882533.91183: checking for any_errors_fatal 28011 1726882533.91184: done checking for any_errors_fatal 28011 1726882533.91185: checking for max_fail_percentage 28011 1726882533.91186: done checking for max_fail_percentage 28011 1726882533.91187: checking to see if all hosts have failed and the running result is not ok 28011 1726882533.91187: done checking to see if all hosts have failed 28011 1726882533.91188: getting the remaining hosts for this loop 28011 1726882533.91192: done getting the remaining hosts for this loop 28011 1726882533.91281: getting the next task for host managed_node1 28011 1726882533.91328: done getting next task for host managed_node1 28011 1726882533.91330: ^ task is: TASK: meta (flush_handlers) 28011 1726882533.91332: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882533.91336: getting variables 28011 1726882533.91338: in VariableManager get_vars() 28011 1726882533.91366: Calling all_inventory to load vars for managed_node1 28011 1726882533.91368: Calling groups_inventory to load vars for managed_node1 28011 1726882533.91371: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882533.91377: done sending task result for task 12673a56-9f93-962d-7c65-000000000151 28011 1726882533.91379: WORKER PROCESS EXITING 28011 1726882533.91389: Calling all_plugins_play to load vars for managed_node1 28011 1726882533.91418: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882533.91425: Calling groups_plugins_play to load vars for managed_node1 28011 1726882533.91656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882533.91853: done with get_vars() 28011 1726882533.91868: done getting variables 28011 1726882533.91941: in VariableManager get_vars() 28011 1726882533.91955: Calling all_inventory to load vars for managed_node1 28011 1726882533.91957: Calling groups_inventory to load vars for managed_node1 28011 1726882533.91960: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882533.91964: Calling all_plugins_play to load vars for managed_node1 28011 1726882533.91966: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882533.91969: Calling groups_plugins_play to load vars for managed_node1 28011 1726882533.92118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882533.92313: done with get_vars() 28011 1726882533.92324: done queuing things up, now waiting for results queue to drain 28011 1726882533.92325: results queue empty 28011 
1726882533.92326: checking for any_errors_fatal 28011 1726882533.92329: done checking for any_errors_fatal 28011 1726882533.92330: checking for max_fail_percentage 28011 1726882533.92331: done checking for max_fail_percentage 28011 1726882533.92331: checking to see if all hosts have failed and the running result is not ok 28011 1726882533.92332: done checking to see if all hosts have failed 28011 1726882533.92336: getting the remaining hosts for this loop 28011 1726882533.92337: done getting the remaining hosts for this loop 28011 1726882533.92339: getting the next task for host managed_node1 28011 1726882533.92343: done getting next task for host managed_node1 28011 1726882533.92345: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 28011 1726882533.92346: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882533.92348: getting variables 28011 1726882533.92349: in VariableManager get_vars() 28011 1726882533.92359: Calling all_inventory to load vars for managed_node1 28011 1726882533.92360: Calling groups_inventory to load vars for managed_node1 28011 1726882533.92362: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882533.92366: Calling all_plugins_play to load vars for managed_node1 28011 1726882533.92368: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882533.92370: Calling groups_plugins_play to load vars for managed_node1 28011 1726882533.92510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882533.92725: done with get_vars() 28011 1726882533.92736: done getting variables 28011 1726882533.92770: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882533.92913: variable 'type' from source: play vars 28011 1726882533.92918: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:11 Friday 20 September 2024 21:35:33 -0400 (0:00:00.920) 0:00:03.480 ****** 28011 1726882533.92960: entering _queue_task() for managed_node1/set_fact 28011 1726882533.93370: worker is 1 (out of 1 available) 28011 1726882533.93501: exiting _queue_task() for managed_node1/set_fact 28011 1726882533.93511: done queuing things up, now waiting for results queue to drain 28011 1726882533.93512: waiting for pending results... 
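The banner above marks the `set_fact` task from tests_route_table.yml:11 being queued. As a point of reference, a task of roughly this shape (a hedged reconstruction — the playbook file itself is not shown in this log, only its effects) would yield the `ansible_facts` result reported a few records further down:

```yaml
# Hedged reconstruction of the task at tests_route_table.yml:11.
# The log confirms only the task name, the set_fact action, and the
# resulting facts; the exact file contents are an assumption.
- name: Set type={{ type }} and interface={{ interface }}
  set_fact:
    type: "{{ type }}"            # play var, resolves to "veth"
    interface: "{{ interface }}"  # play var, resolves to "ethtest0"
```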
28011 1726882533.93840: running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=ethtest0 28011 1726882533.94051: in run() - task 12673a56-9f93-962d-7c65-00000000000b 28011 1726882533.94072: variable 'ansible_search_path' from source: unknown 28011 1726882533.94151: calling self._execute() 28011 1726882533.94370: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882533.94383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882533.94403: variable 'omit' from source: magic vars 28011 1726882533.94780: variable 'ansible_distribution_major_version' from source: facts 28011 1726882533.94809: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882533.94821: variable 'omit' from source: magic vars 28011 1726882533.94843: variable 'omit' from source: magic vars 28011 1726882533.94875: variable 'type' from source: play vars 28011 1726882533.94953: variable 'type' from source: play vars 28011 1726882533.94966: variable 'interface' from source: play vars 28011 1726882533.95110: variable 'interface' from source: play vars 28011 1726882533.95113: variable 'omit' from source: magic vars 28011 1726882533.95115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882533.95139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882533.95164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882533.95185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882533.95205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882533.95247: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 
1726882533.95257: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882533.95264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882533.95373: Set connection var ansible_connection to ssh 28011 1726882533.95386: Set connection var ansible_pipelining to False 28011 1726882533.95402: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882533.95413: Set connection var ansible_shell_executable to /bin/sh 28011 1726882533.95425: Set connection var ansible_timeout to 10 28011 1726882533.95445: Set connection var ansible_shell_type to sh 28011 1726882533.95472: variable 'ansible_shell_executable' from source: unknown 28011 1726882533.95481: variable 'ansible_connection' from source: unknown 28011 1726882533.95488: variable 'ansible_module_compression' from source: unknown 28011 1726882533.95546: variable 'ansible_shell_type' from source: unknown 28011 1726882533.95549: variable 'ansible_shell_executable' from source: unknown 28011 1726882533.95551: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882533.95553: variable 'ansible_pipelining' from source: unknown 28011 1726882533.95555: variable 'ansible_timeout' from source: unknown 28011 1726882533.95557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882533.95682: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882533.95704: variable 'omit' from source: magic vars 28011 1726882533.95714: starting attempt loop 28011 1726882533.95721: running the handler 28011 1726882533.95737: handler run complete 28011 1726882533.95764: attempt loop complete, returning result 28011 1726882533.95767: _execute() done 28011 
1726882533.95769: dumping result to json 28011 1726882533.95797: done dumping result, returning 28011 1726882533.95800: done running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=ethtest0 [12673a56-9f93-962d-7c65-00000000000b] 28011 1726882533.95803: sending task result for task 12673a56-9f93-962d-7c65-00000000000b 28011 1726882533.96118: done sending task result for task 12673a56-9f93-962d-7c65-00000000000b 28011 1726882533.96121: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 28011 1726882533.96161: no more pending results, returning what we have 28011 1726882533.96163: results queue empty 28011 1726882533.96164: checking for any_errors_fatal 28011 1726882533.96165: done checking for any_errors_fatal 28011 1726882533.96166: checking for max_fail_percentage 28011 1726882533.96168: done checking for max_fail_percentage 28011 1726882533.96168: checking to see if all hosts have failed and the running result is not ok 28011 1726882533.96169: done checking to see if all hosts have failed 28011 1726882533.96170: getting the remaining hosts for this loop 28011 1726882533.96171: done getting the remaining hosts for this loop 28011 1726882533.96174: getting the next task for host managed_node1 28011 1726882533.96178: done getting next task for host managed_node1 28011 1726882533.96180: ^ task is: TASK: Include the task 'show_interfaces.yml' 28011 1726882533.96181: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882533.96185: getting variables 28011 1726882533.96186: in VariableManager get_vars() 28011 1726882533.96221: Calling all_inventory to load vars for managed_node1 28011 1726882533.96224: Calling groups_inventory to load vars for managed_node1 28011 1726882533.96226: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882533.96238: Calling all_plugins_play to load vars for managed_node1 28011 1726882533.96241: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882533.96244: Calling groups_plugins_play to load vars for managed_node1 28011 1726882533.96429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882533.96618: done with get_vars() 28011 1726882533.96627: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:15 Friday 20 September 2024 21:35:33 -0400 (0:00:00.037) 0:00:03.518 ****** 28011 1726882533.96722: entering _queue_task() for managed_node1/include_tasks 28011 1726882533.96954: worker is 1 (out of 1 available) 28011 1726882533.96965: exiting _queue_task() for managed_node1/include_tasks 28011 1726882533.96976: done queuing things up, now waiting for results queue to drain 28011 1726882533.96977: waiting for pending results... 
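The next task queued is the include at tests_route_table.yml:15. A minimal sketch of what such an include might look like — the `when` clause mirrors the conditional the log reports as "Evaluated conditional (ansible_distribution_major_version != '6'): True" for this task, though whether it is set per-task or play-wide is not visible in this excerpt:

```yaml
# Hedged sketch; task name and include target are taken from the log,
# the relative path layout is an assumption.
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml
  when: ansible_distribution_major_version != '6'
```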
28011 1726882533.97225: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 28011 1726882533.97325: in run() - task 12673a56-9f93-962d-7c65-00000000000c 28011 1726882533.97398: variable 'ansible_search_path' from source: unknown 28011 1726882533.97402: calling self._execute() 28011 1726882533.97467: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882533.97477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882533.97495: variable 'omit' from source: magic vars 28011 1726882533.97853: variable 'ansible_distribution_major_version' from source: facts 28011 1726882533.97873: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882533.97888: _execute() done 28011 1726882533.97900: dumping result to json 28011 1726882533.97908: done dumping result, returning 28011 1726882533.97980: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-962d-7c65-00000000000c] 28011 1726882533.97983: sending task result for task 12673a56-9f93-962d-7c65-00000000000c 28011 1726882533.98052: done sending task result for task 12673a56-9f93-962d-7c65-00000000000c 28011 1726882533.98055: WORKER PROCESS EXITING 28011 1726882533.98080: no more pending results, returning what we have 28011 1726882533.98088: in VariableManager get_vars() 28011 1726882533.98137: Calling all_inventory to load vars for managed_node1 28011 1726882533.98139: Calling groups_inventory to load vars for managed_node1 28011 1726882533.98142: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882533.98154: Calling all_plugins_play to load vars for managed_node1 28011 1726882533.98157: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882533.98160: Calling groups_plugins_play to load vars for managed_node1 28011 1726882533.98552: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882533.98759: done with get_vars() 28011 1726882533.98766: variable 'ansible_search_path' from source: unknown 28011 1726882533.98778: we have included files to process 28011 1726882533.98779: generating all_blocks data 28011 1726882533.98780: done generating all_blocks data 28011 1726882533.98781: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28011 1726882533.98782: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28011 1726882533.98784: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28011 1726882533.98934: in VariableManager get_vars() 28011 1726882533.98960: done with get_vars() 28011 1726882533.99068: done processing included file 28011 1726882533.99070: iterating over new_blocks loaded from include file 28011 1726882533.99072: in VariableManager get_vars() 28011 1726882533.99088: done with get_vars() 28011 1726882533.99092: filtering new block on tags 28011 1726882533.99109: done filtering new block on tags 28011 1726882533.99112: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 28011 1726882533.99116: extending task lists for all hosts with included blocks 28011 1726882534.01159: done extending task lists 28011 1726882534.01161: done processing included files 28011 1726882534.01161: results queue empty 28011 1726882534.01162: checking for any_errors_fatal 28011 1726882534.01164: done checking for any_errors_fatal 28011 1726882534.01165: checking for max_fail_percentage 28011 1726882534.01166: done checking for 
max_fail_percentage 28011 1726882534.01167: checking to see if all hosts have failed and the running result is not ok 28011 1726882534.01168: done checking to see if all hosts have failed 28011 1726882534.01169: getting the remaining hosts for this loop 28011 1726882534.01170: done getting the remaining hosts for this loop 28011 1726882534.01172: getting the next task for host managed_node1 28011 1726882534.01176: done getting next task for host managed_node1 28011 1726882534.01179: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28011 1726882534.01181: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882534.01183: getting variables 28011 1726882534.01184: in VariableManager get_vars() 28011 1726882534.01202: Calling all_inventory to load vars for managed_node1 28011 1726882534.01205: Calling groups_inventory to load vars for managed_node1 28011 1726882534.01207: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.01212: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.01215: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.01219: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.01368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.01581: done with get_vars() 28011 1726882534.01594: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:35:34 -0400 (0:00:00.049) 0:00:03.567 ****** 28011 1726882534.01667: entering _queue_task() for managed_node1/include_tasks 28011 1726882534.01984: worker is 1 (out of 1 available) 28011 1726882534.02201: exiting _queue_task() for managed_node1/include_tasks 28011 1726882534.02211: done queuing things up, now waiting for results queue to drain 28011 1726882534.02212: waiting for pending results... 
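Every debug record in this output follows the same `<pid> <epoch-timestamp>: <message>` shape (pid 28011 throughout this run). When post-processing a `-vvvv` log like this one, the fields can be pulled apart with a small sketch along these lines:

```python
import re

# ansible -vvvv debug records look like "<pid> <epoch.frac>: <message>";
# TASK banners and ssh "stderr chunk" continuations do not match.
PATTERN = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def parse_debug_line(line):
    """Return (pid, timestamp, message), or None for non-record lines."""
    m = PATTERN.match(line)
    if m is None:
        return None
    return int(m.group("pid")), float(m.group("ts")), m.group("msg")

# Example record copied from this log:
line = "28011 1726882534.01667: entering _queue_task() for managed_node1/include_tasks"
print(parse_debug_line(line))
```

Sorting the parsed records by the timestamp field also makes it easy to recover per-task durations, which the TASK banners only report in aggregate (e.g. "0:00:03.567").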
28011 1726882534.02340: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 28011 1726882534.02399: in run() - task 12673a56-9f93-962d-7c65-000000000169 28011 1726882534.02402: variable 'ansible_search_path' from source: unknown 28011 1726882534.02404: variable 'ansible_search_path' from source: unknown 28011 1726882534.02418: calling self._execute() 28011 1726882534.02500: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.02511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.02532: variable 'omit' from source: magic vars 28011 1726882534.02963: variable 'ansible_distribution_major_version' from source: facts 28011 1726882534.02966: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882534.02971: _execute() done 28011 1726882534.02978: dumping result to json 28011 1726882534.02980: done dumping result, returning 28011 1726882534.02983: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-962d-7c65-000000000169] 28011 1726882534.02986: sending task result for task 12673a56-9f93-962d-7c65-000000000169 28011 1726882534.03112: no more pending results, returning what we have 28011 1726882534.03117: in VariableManager get_vars() 28011 1726882534.03162: Calling all_inventory to load vars for managed_node1 28011 1726882534.03165: Calling groups_inventory to load vars for managed_node1 28011 1726882534.03167: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.03401: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.03405: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.03408: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.03559: done sending task result for task 12673a56-9f93-962d-7c65-000000000169 28011 1726882534.03563: WORKER PROCESS EXITING 28011 
1726882534.03584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.03806: done with get_vars() 28011 1726882534.03813: variable 'ansible_search_path' from source: unknown 28011 1726882534.03814: variable 'ansible_search_path' from source: unknown 28011 1726882534.03854: we have included files to process 28011 1726882534.03855: generating all_blocks data 28011 1726882534.03857: done generating all_blocks data 28011 1726882534.03858: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28011 1726882534.03859: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28011 1726882534.03861: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28011 1726882534.04180: done processing included file 28011 1726882534.04182: iterating over new_blocks loaded from include file 28011 1726882534.04184: in VariableManager get_vars() 28011 1726882534.04206: done with get_vars() 28011 1726882534.04208: filtering new block on tags 28011 1726882534.04224: done filtering new block on tags 28011 1726882534.04226: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 28011 1726882534.04231: extending task lists for all hosts with included blocks 28011 1726882534.04335: done extending task lists 28011 1726882534.04336: done processing included files 28011 1726882534.04337: results queue empty 28011 1726882534.04338: checking for any_errors_fatal 28011 1726882534.04340: done checking for any_errors_fatal 28011 1726882534.04341: checking for max_fail_percentage 28011 1726882534.04342: done 
checking for max_fail_percentage 28011 1726882534.04342: checking to see if all hosts have failed and the running result is not ok 28011 1726882534.04343: done checking to see if all hosts have failed 28011 1726882534.04344: getting the remaining hosts for this loop 28011 1726882534.04345: done getting the remaining hosts for this loop 28011 1726882534.04347: getting the next task for host managed_node1 28011 1726882534.04351: done getting next task for host managed_node1 28011 1726882534.04353: ^ task is: TASK: Gather current interface info 28011 1726882534.04355: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882534.04358: getting variables 28011 1726882534.04359: in VariableManager get_vars() 28011 1726882534.04376: Calling all_inventory to load vars for managed_node1 28011 1726882534.04378: Calling groups_inventory to load vars for managed_node1 28011 1726882534.04380: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.04385: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.04388: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.04395: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.04745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.04938: done with get_vars() 28011 1726882534.04945: done getting variables 28011 1726882534.04976: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:35:34 -0400 (0:00:00.033) 0:00:03.601 ****** 28011 1726882534.05003: entering _queue_task() for managed_node1/command 28011 1726882534.05366: worker is 1 (out of 1 available) 28011 1726882534.05374: exiting _queue_task() for managed_node1/command 28011 1726882534.05383: done queuing things up, now waiting for results queue to drain 28011 1726882534.05385: waiting for pending results... 
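The "Gather current interface info" task at get_current_interfaces.yml:3 loads the `command` action plugin before opening the ssh connection. The exact command line is not shown in this excerpt, so the body below is purely an assumption — one common way such a task enumerates interfaces, with a hypothetical register name:

```yaml
# Hedged sketch only: the log confirms the task name and the command
# action plugin; the command itself and the "_current_interfaces"
# register variable are assumptions for illustration.
- name: Gather current interface info
  command: ls /sys/class/net
  register: _current_interfaces
```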
28011 1726882534.05532: running TaskExecutor() for managed_node1/TASK: Gather current interface info 28011 1726882534.05651: in run() - task 12673a56-9f93-962d-7c65-00000000024e 28011 1726882534.05669: variable 'ansible_search_path' from source: unknown 28011 1726882534.05681: variable 'ansible_search_path' from source: unknown 28011 1726882534.05726: calling self._execute() 28011 1726882534.05811: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.05827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.05841: variable 'omit' from source: magic vars 28011 1726882534.06221: variable 'ansible_distribution_major_version' from source: facts 28011 1726882534.06240: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882534.06250: variable 'omit' from source: magic vars 28011 1726882534.06306: variable 'omit' from source: magic vars 28011 1726882534.06348: variable 'omit' from source: magic vars 28011 1726882534.06398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882534.06435: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882534.06465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882534.06492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882534.06554: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882534.06557: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882534.06559: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.06561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 
1726882534.06655: Set connection var ansible_connection to ssh 28011 1726882534.06673: Set connection var ansible_pipelining to False 28011 1726882534.06682: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882534.06700: Set connection var ansible_shell_executable to /bin/sh 28011 1726882534.06712: Set connection var ansible_timeout to 10 28011 1726882534.06720: Set connection var ansible_shell_type to sh 28011 1726882534.06745: variable 'ansible_shell_executable' from source: unknown 28011 1726882534.06771: variable 'ansible_connection' from source: unknown 28011 1726882534.06774: variable 'ansible_module_compression' from source: unknown 28011 1726882534.06776: variable 'ansible_shell_type' from source: unknown 28011 1726882534.06778: variable 'ansible_shell_executable' from source: unknown 28011 1726882534.06780: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.06782: variable 'ansible_pipelining' from source: unknown 28011 1726882534.06798: variable 'ansible_timeout' from source: unknown 28011 1726882534.06801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.06940: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882534.06989: variable 'omit' from source: magic vars 28011 1726882534.06996: starting attempt loop 28011 1726882534.06999: running the handler 28011 1726882534.07001: _low_level_execute_command(): starting 28011 1726882534.07003: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882534.07810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882534.07886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882534.07929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882534.07933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882534.08019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882534.09735: stdout chunk (state=3): >>>/root <<< 28011 1726882534.09864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882534.09869: stdout chunk (state=3): >>><<< 28011 1726882534.09871: stderr chunk (state=3): >>><<< 28011 1726882534.09992: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882534.10000: _low_level_execute_command(): starting 28011 1726882534.10003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544 `" && echo ansible-tmp-1726882534.09903-28208-239909407936544="` echo /root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544 `" ) && sleep 0' 28011 1726882534.10561: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882534.10574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882534.10586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882534.10608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882534.10622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882534.10644: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882534.10753: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882534.10788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882534.10825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882534.12910: stdout chunk (state=3): >>>ansible-tmp-1726882534.09903-28208-239909407936544=/root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544 <<< 28011 1726882534.12914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882534.12916: stdout chunk (state=3): >>><<< 28011 1726882534.12918: stderr chunk (state=3): >>><<< 28011 1726882534.12921: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882534.09903-28208-239909407936544=/root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882534.12942: variable 'ansible_module_compression' from source: unknown 28011 1726882534.12997: ANSIBALLZ: Using generic lock for ansible.legacy.command 28011 1726882534.13005: ANSIBALLZ: Acquiring lock 28011 1726882534.13017: ANSIBALLZ: Lock acquired: 139767565767152 28011 1726882534.13098: ANSIBALLZ: Creating module 28011 1726882534.30561: ANSIBALLZ: Writing module into payload 28011 1726882534.30781: ANSIBALLZ: Writing module 28011 1726882534.30818: ANSIBALLZ: Renaming module 28011 1726882534.30864: ANSIBALLZ: Done creating module 28011 1726882534.30896: variable 'ansible_facts' from source: unknown 28011 1726882534.30974: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544/AnsiballZ_command.py 28011 1726882534.31232: Sending initial data 28011 1726882534.31235: Sent initial data (154 bytes) 28011 1726882534.31791: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882534.31809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882534.31817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882534.31832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882534.31845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 <<< 28011 1726882534.31856: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882534.31867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882534.31965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882534.31989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882534.32039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882534.33633: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882534.33784: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 28011 1726882534.33820: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpsnikn8_0 /root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544/AnsiballZ_command.py <<< 28011 1726882534.33922: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544/AnsiballZ_command.py" <<< 28011 1726882534.33936: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpsnikn8_0" to remote "/root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544/AnsiballZ_command.py" <<< 28011 1726882534.34877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882534.34960: stderr chunk (state=3): >>><<< 28011 1726882534.34974: stdout chunk (state=3): >>><<< 28011 1726882534.35009: done transferring module to remote 28011 1726882534.35030: _low_level_execute_command(): starting 28011 1726882534.35045: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544/ /root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544/AnsiballZ_command.py && sleep 0' 28011 1726882534.35810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882534.35830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882534.35863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882534.35937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882534.37694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882534.37796: stdout chunk (state=3): >>><<< 28011 1726882534.37799: stderr chunk (state=3): >>><<< 28011 1726882534.37802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882534.37805: _low_level_execute_command(): starting 28011 1726882534.37807: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544/AnsiballZ_command.py && sleep 0' 28011 1726882534.38389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882534.38407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882534.38442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882534.38457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882534.38477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882534.38563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882534.53747: stdout chunk (state=3): 
>>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:34.533124", "end": "2024-09-20 21:35:34.536321", "delta": "0:00:00.003197", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882534.55300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882534.55305: stdout chunk (state=3): >>><<< 28011 1726882534.55307: stderr chunk (state=3): >>><<< 28011 1726882534.55310: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:34.533124", "end": "2024-09-20 21:35:34.536321", "delta": "0:00:00.003197", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882534.55312: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882534.55314: _low_level_execute_command(): starting 28011 1726882534.55316: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882534.09903-28208-239909407936544/ > /dev/null 2>&1 && sleep 0' 28011 1726882534.55995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882534.56000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882534.56002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882534.56004: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882534.56006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882534.56199: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882534.56203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882534.56205: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882534.56211: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882534.56213: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28011 1726882534.56220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882534.56224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882534.56228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882534.56231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882534.56297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882534.58073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882534.58123: stderr chunk (state=3): >>><<< 28011 1726882534.58126: stdout chunk (state=3): >>><<< 28011 1726882534.58156: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882534.58162: handler run complete 28011 1726882534.58186: Evaluated conditional (False): False 28011 1726882534.58199: attempt loop complete, returning result 28011 1726882534.58202: _execute() done 28011 1726882534.58204: dumping result to json 28011 1726882534.58210: done dumping result, returning 28011 1726882534.58218: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [12673a56-9f93-962d-7c65-00000000024e] 28011 1726882534.58220: sending task result for task 12673a56-9f93-962d-7c65-00000000024e 28011 1726882534.58335: done sending task result for task 12673a56-9f93-962d-7c65-00000000024e 28011 1726882534.58338: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003197", "end": "2024-09-20 21:35:34.536321", "rc": 0, "start": "2024-09-20 21:35:34.533124" } STDOUT: bonding_masters eth0 lo 28011 1726882534.58538: no more pending results, returning what we have 28011 
1726882534.58542: results queue empty 28011 1726882534.58542: checking for any_errors_fatal 28011 1726882534.58544: done checking for any_errors_fatal 28011 1726882534.58545: checking for max_fail_percentage 28011 1726882534.58547: done checking for max_fail_percentage 28011 1726882534.58548: checking to see if all hosts have failed and the running result is not ok 28011 1726882534.58548: done checking to see if all hosts have failed 28011 1726882534.58549: getting the remaining hosts for this loop 28011 1726882534.58550: done getting the remaining hosts for this loop 28011 1726882534.58554: getting the next task for host managed_node1 28011 1726882534.58560: done getting next task for host managed_node1 28011 1726882534.58563: ^ task is: TASK: Set current_interfaces 28011 1726882534.58567: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882534.58571: getting variables 28011 1726882534.58573: in VariableManager get_vars() 28011 1726882534.58617: Calling all_inventory to load vars for managed_node1 28011 1726882534.58620: Calling groups_inventory to load vars for managed_node1 28011 1726882534.58623: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.58634: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.58637: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.58640: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.59055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.59478: done with get_vars() 28011 1726882534.59488: done getting variables 28011 1726882534.59684: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:35:34 -0400 (0:00:00.547) 0:00:04.148 ****** 28011 1726882534.59712: entering _queue_task() for managed_node1/set_fact 28011 1726882534.60246: worker is 1 (out of 1 available) 28011 1726882534.60257: exiting _queue_task() for managed_node1/set_fact 28011 1726882534.60268: done queuing things up, now waiting for results queue to drain 28011 1726882534.60270: waiting for pending results... 
28011 1726882534.60811: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 28011 1726882534.60999: in run() - task 12673a56-9f93-962d-7c65-00000000024f 28011 1726882534.61003: variable 'ansible_search_path' from source: unknown 28011 1726882534.61006: variable 'ansible_search_path' from source: unknown 28011 1726882534.61009: calling self._execute() 28011 1726882534.61205: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.61598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.61602: variable 'omit' from source: magic vars 28011 1726882534.62066: variable 'ansible_distribution_major_version' from source: facts 28011 1726882534.62088: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882534.62119: variable 'omit' from source: magic vars 28011 1726882534.62406: variable 'omit' from source: magic vars 28011 1726882534.62458: variable '_current_interfaces' from source: set_fact 28011 1726882534.62520: variable 'omit' from source: magic vars 28011 1726882534.62634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882534.62834: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882534.62861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882534.62884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882534.62904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882534.62937: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882534.62946: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.62953: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.63050: Set connection var ansible_connection to ssh 28011 1726882534.63243: Set connection var ansible_pipelining to False 28011 1726882534.63255: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882534.63265: Set connection var ansible_shell_executable to /bin/sh 28011 1726882534.63278: Set connection var ansible_timeout to 10 28011 1726882534.63287: Set connection var ansible_shell_type to sh 28011 1726882534.63320: variable 'ansible_shell_executable' from source: unknown 28011 1726882534.63329: variable 'ansible_connection' from source: unknown 28011 1726882534.63336: variable 'ansible_module_compression' from source: unknown 28011 1726882534.63344: variable 'ansible_shell_type' from source: unknown 28011 1726882534.63400: variable 'ansible_shell_executable' from source: unknown 28011 1726882534.63408: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.63414: variable 'ansible_pipelining' from source: unknown 28011 1726882534.63421: variable 'ansible_timeout' from source: unknown 28011 1726882534.63428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.63751: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882534.63767: variable 'omit' from source: magic vars 28011 1726882534.63776: starting attempt loop 28011 1726882534.63783: running the handler 28011 1726882534.63805: handler run complete 28011 1726882534.63998: attempt loop complete, returning result 28011 1726882534.64001: _execute() done 28011 1726882534.64004: dumping result to json 28011 1726882534.64006: done dumping result, returning 28011 
1726882534.64008: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [12673a56-9f93-962d-7c65-00000000024f] 28011 1726882534.64011: sending task result for task 12673a56-9f93-962d-7c65-00000000024f 28011 1726882534.64083: done sending task result for task 12673a56-9f93-962d-7c65-00000000024f 28011 1726882534.64088: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 28011 1726882534.64150: no more pending results, returning what we have 28011 1726882534.64154: results queue empty 28011 1726882534.64154: checking for any_errors_fatal 28011 1726882534.64162: done checking for any_errors_fatal 28011 1726882534.64162: checking for max_fail_percentage 28011 1726882534.64164: done checking for max_fail_percentage 28011 1726882534.64164: checking to see if all hosts have failed and the running result is not ok 28011 1726882534.64165: done checking to see if all hosts have failed 28011 1726882534.64166: getting the remaining hosts for this loop 28011 1726882534.64167: done getting the remaining hosts for this loop 28011 1726882534.64171: getting the next task for host managed_node1 28011 1726882534.64178: done getting next task for host managed_node1 28011 1726882534.64180: ^ task is: TASK: Show current_interfaces 28011 1726882534.64183: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882534.64187: getting variables 28011 1726882534.64188: in VariableManager get_vars() 28011 1726882534.64228: Calling all_inventory to load vars for managed_node1 28011 1726882534.64230: Calling groups_inventory to load vars for managed_node1 28011 1726882534.64232: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.64242: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.64244: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.64247: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.64648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.65176: done with get_vars() 28011 1726882534.65187: done getting variables 28011 1726882534.65363: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:35:34 -0400 (0:00:00.056) 0:00:04.205 ****** 28011 1726882534.65399: entering _queue_task() for managed_node1/debug 28011 1726882534.65401: Creating lock for debug 28011 1726882534.66252: worker is 1 (out of 1 available) 28011 1726882534.66262: exiting _queue_task() for managed_node1/debug 28011 1726882534.66271: done queuing things up, now waiting for results queue to drain 28011 1726882534.66272: waiting for pending results... 
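[Editor's note] The `Set current_interfaces` result above came from a `set_fact` handler, but the task source itself is not reproduced in this log. A minimal sketch of a task shape that would yield the logged `ansible_facts` (the `_current_interfaces` source variable name is an assumption, not taken from the log):

```yaml
# Hypothetical sketch only -- the real task lives in the
# fedora.linux_system_roles test playbooks and is not shown here.
- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces }}"  # assumed source variable
```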
28011 1726882534.66718: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 28011 1726882534.66722: in run() - task 12673a56-9f93-962d-7c65-00000000016a 28011 1726882534.66725: variable 'ansible_search_path' from source: unknown 28011 1726882534.66728: variable 'ansible_search_path' from source: unknown 28011 1726882534.67098: calling self._execute() 28011 1726882534.67103: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.67106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.67110: variable 'omit' from source: magic vars 28011 1726882534.67819: variable 'ansible_distribution_major_version' from source: facts 28011 1726882534.67837: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882534.67908: variable 'omit' from source: magic vars 28011 1726882534.67950: variable 'omit' from source: magic vars 28011 1726882534.68054: variable 'current_interfaces' from source: set_fact 28011 1726882534.68199: variable 'omit' from source: magic vars 28011 1726882534.68324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882534.68363: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882534.68421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882534.68699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882534.68702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882534.68705: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882534.68707: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.68709: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.68778: Set connection var ansible_connection to ssh 28011 1726882534.68907: Set connection var ansible_pipelining to False 28011 1726882534.68918: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882534.68927: Set connection var ansible_shell_executable to /bin/sh 28011 1726882534.68938: Set connection var ansible_timeout to 10 28011 1726882534.68947: Set connection var ansible_shell_type to sh 28011 1726882534.68975: variable 'ansible_shell_executable' from source: unknown 28011 1726882534.68983: variable 'ansible_connection' from source: unknown 28011 1726882534.68996: variable 'ansible_module_compression' from source: unknown 28011 1726882534.69004: variable 'ansible_shell_type' from source: unknown 28011 1726882534.69011: variable 'ansible_shell_executable' from source: unknown 28011 1726882534.69017: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.69023: variable 'ansible_pipelining' from source: unknown 28011 1726882534.69084: variable 'ansible_timeout' from source: unknown 28011 1726882534.69100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.69298: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882534.69407: variable 'omit' from source: magic vars 28011 1726882534.69416: starting attempt loop 28011 1726882534.69799: running the handler 28011 1726882534.69802: handler run complete 28011 1726882534.69805: attempt loop complete, returning result 28011 1726882534.69807: _execute() done 28011 1726882534.69809: dumping result to json 28011 1726882534.69811: done dumping result, returning 28011 1726882534.69814: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [12673a56-9f93-962d-7c65-00000000016a]
28011 1726882534.69816: sending task result for task 12673a56-9f93-962d-7c65-00000000016a
28011 1726882534.69885: done sending task result for task 12673a56-9f93-962d-7c65-00000000016a
28011 1726882534.69892: WORKER PROCESS EXITING
ok: [managed_node1] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo']
28011 1726882534.69941: no more pending results, returning what we have
28011 1726882534.69949: results queue empty
28011 1726882534.69950: checking for any_errors_fatal
28011 1726882534.69954: done checking for any_errors_fatal
28011 1726882534.69955: checking for max_fail_percentage
28011 1726882534.69957: done checking for max_fail_percentage
28011 1726882534.69958: checking to see if all hosts have failed and the running result is not ok
28011 1726882534.69959: done checking to see if all hosts have failed
28011 1726882534.69960: getting the remaining hosts for this loop
28011 1726882534.69961: done getting the remaining hosts for this loop
28011 1726882534.69964: getting the next task for host managed_node1
28011 1726882534.69971: done getting next task for host managed_node1
28011 1726882534.69974: ^ task is: TASK: Include the task 'manage_test_interface.yml'
28011 1726882534.69975: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 28011 1726882534.69980: getting variables 28011 1726882534.69981: in VariableManager get_vars() 28011 1726882534.70023: Calling all_inventory to load vars for managed_node1 28011 1726882534.70026: Calling groups_inventory to load vars for managed_node1 28011 1726882534.70029: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.70040: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.70043: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.70046: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.70573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.70774: done with get_vars() 28011 1726882534.70785: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:17 Friday 20 September 2024 21:35:34 -0400 (0:00:00.055) 0:00:04.260 ****** 28011 1726882534.70901: entering _queue_task() for managed_node1/include_tasks 28011 1726882534.71173: worker is 1 (out of 1 available) 28011 1726882534.71185: exiting _queue_task() for managed_node1/include_tasks 28011 1726882534.71359: done queuing things up, now waiting for results queue to drain 28011 1726882534.71362: waiting for pending results... 
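[Editor's note] The `Show current_interfaces` output above is produced by a `debug` action loaded from show_interfaces.yml:5. The file contents are not part of this log; a task of roughly this shape would produce the logged `MSG`:

```yaml
# Sketch reconstructed from the logged message; not the verbatim task file.
- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```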
28011 1726882534.71482: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 28011 1726882534.71577: in run() - task 12673a56-9f93-962d-7c65-00000000000d 28011 1726882534.71603: variable 'ansible_search_path' from source: unknown 28011 1726882534.71645: calling self._execute() 28011 1726882534.71733: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.71760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.71778: variable 'omit' from source: magic vars 28011 1726882534.72162: variable 'ansible_distribution_major_version' from source: facts 28011 1726882534.72179: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882534.72189: _execute() done 28011 1726882534.72202: dumping result to json 28011 1726882534.72210: done dumping result, returning 28011 1726882534.72220: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [12673a56-9f93-962d-7c65-00000000000d] 28011 1726882534.72231: sending task result for task 12673a56-9f93-962d-7c65-00000000000d 28011 1726882534.72518: no more pending results, returning what we have 28011 1726882534.72523: in VariableManager get_vars() 28011 1726882534.72563: Calling all_inventory to load vars for managed_node1 28011 1726882534.72566: Calling groups_inventory to load vars for managed_node1 28011 1726882534.72569: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.72579: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.72582: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.72585: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.72802: done sending task result for task 12673a56-9f93-962d-7c65-00000000000d 28011 1726882534.72806: WORKER PROCESS EXITING 28011 1726882534.72828: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.73030: done with get_vars() 28011 1726882534.73038: variable 'ansible_search_path' from source: unknown 28011 1726882534.73051: we have included files to process 28011 1726882534.73053: generating all_blocks data 28011 1726882534.73054: done generating all_blocks data 28011 1726882534.73057: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 28011 1726882534.73058: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 28011 1726882534.73061: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 28011 1726882534.73579: in VariableManager get_vars() 28011 1726882534.73607: done with get_vars() 28011 1726882534.73838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28011 1726882534.75057: done processing included file 28011 1726882534.75059: iterating over new_blocks loaded from include file 28011 1726882534.75060: in VariableManager get_vars() 28011 1726882534.75079: done with get_vars() 28011 1726882534.75081: filtering new block on tags 28011 1726882534.75118: done filtering new block on tags 28011 1726882534.75121: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 28011 1726882534.75141: extending task lists for all hosts with included blocks 28011 1726882534.79570: done extending task lists 28011 1726882534.79572: done processing included files 28011 1726882534.79573: results queue empty 28011 1726882534.79574: checking for any_errors_fatal 28011 1726882534.79577: done checking for 
any_errors_fatal 28011 1726882534.79578: checking for max_fail_percentage 28011 1726882534.79579: done checking for max_fail_percentage 28011 1726882534.79580: checking to see if all hosts have failed and the running result is not ok 28011 1726882534.79581: done checking to see if all hosts have failed 28011 1726882534.79581: getting the remaining hosts for this loop 28011 1726882534.79583: done getting the remaining hosts for this loop 28011 1726882534.79585: getting the next task for host managed_node1 28011 1726882534.79592: done getting next task for host managed_node1 28011 1726882534.79598: ^ task is: TASK: Ensure state in ["present", "absent"] 28011 1726882534.79601: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882534.79603: getting variables 28011 1726882534.79604: in VariableManager get_vars() 28011 1726882534.79624: Calling all_inventory to load vars for managed_node1 28011 1726882534.79627: Calling groups_inventory to load vars for managed_node1 28011 1726882534.79629: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.79636: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.79639: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.79642: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.80002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.80400: done with get_vars() 28011 1726882534.80411: done getting variables 28011 1726882534.80479: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:35:34 -0400 (0:00:00.098) 0:00:04.358 ****** 28011 1726882534.80715: entering _queue_task() for managed_node1/fail 28011 1726882534.80717: Creating lock for fail 28011 1726882534.81235: worker is 1 (out of 1 available) 28011 1726882534.81245: exiting _queue_task() for managed_node1/fail 28011 1726882534.81258: done queuing things up, now waiting for results queue to drain 28011 1726882534.81259: waiting for pending results... 
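[Editor's note] The include processing above (generating `all_blocks`, filtering on tags, extending task lists) is driven by an `include_tasks` task at tests_route_table.yml:17. As a sketch of the pattern (the relative path is assumed from the logged file locations):

```yaml
# Sketch only; the actual include statement is not shown in this log.
- name: Include the task 'manage_test_interface.yml'
  ansible.builtin.include_tasks:
    file: tasks/manage_test_interface.yml  # path assumed
```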
28011 1726882534.82013: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"]
28011 1726882534.82019: in run() - task 12673a56-9f93-962d-7c65-00000000026a
28011 1726882534.82022: variable 'ansible_search_path' from source: unknown
28011 1726882534.82024: variable 'ansible_search_path' from source: unknown
28011 1726882534.82210: calling self._execute()
28011 1726882534.82295: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882534.82598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882534.82603: variable 'omit' from source: magic vars
28011 1726882534.83086: variable 'ansible_distribution_major_version' from source: facts
28011 1726882534.83298: Evaluated conditional (ansible_distribution_major_version != '6'): True
28011 1726882534.83682: variable 'state' from source: include params
28011 1726882534.83689: Evaluated conditional (state not in ["present", "absent"]): False
28011 1726882534.83697: when evaluation is False, skipping this task
28011 1726882534.83700: _execute() done
28011 1726882534.83703: dumping result to json
28011 1726882534.83705: done dumping result, returning
28011 1726882534.83708: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [12673a56-9f93-962d-7c65-00000000026a]
28011 1726882534.83712: sending task result for task 12673a56-9f93-962d-7c65-00000000026a
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "state not in [\"present\", \"absent\"]",
    "skip_reason": "Conditional result was False"
}
28011 1726882534.83860: no more pending results, returning what we have
28011 1726882534.83864: results queue empty
28011 1726882534.83864: checking for any_errors_fatal
28011 1726882534.83866: done checking for any_errors_fatal
28011 1726882534.83867: checking for max_fail_percentage
28011 1726882534.83868: done checking for max_fail_percentage
28011 1726882534.83869: checking to see if all hosts
have failed and the running result is not ok 28011 1726882534.83870: done checking to see if all hosts have failed 28011 1726882534.83870: getting the remaining hosts for this loop 28011 1726882534.83872: done getting the remaining hosts for this loop 28011 1726882534.83875: getting the next task for host managed_node1 28011 1726882534.83881: done getting next task for host managed_node1 28011 1726882534.83884: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 28011 1726882534.83887: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882534.83896: getting variables 28011 1726882534.83898: in VariableManager get_vars() 28011 1726882534.83944: Calling all_inventory to load vars for managed_node1 28011 1726882534.83947: Calling groups_inventory to load vars for managed_node1 28011 1726882534.83950: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.83957: done sending task result for task 12673a56-9f93-962d-7c65-00000000026a 28011 1726882534.83960: WORKER PROCESS EXITING 28011 1726882534.83974: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.83977: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.83980: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.84381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.84936: done with get_vars() 28011 1726882534.84946: done getting variables 28011 1726882534.85312: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:35:34 -0400 (0:00:00.046) 0:00:04.404 ****** 28011 1726882534.85341: entering _queue_task() for managed_node1/fail 28011 1726882534.85803: worker is 1 (out of 1 available) 28011 1726882534.85815: exiting _queue_task() for managed_node1/fail 28011 1726882534.85826: done queuing things up, now waiting for results queue to drain 28011 1726882534.85828: waiting for pending results... 
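[Editor's note] The skip above comes from a `fail`-based input guard: the log records `false_condition: state not in ["present", "absent"]`, so the `when:` expression was False and the task never fired. A sketch reconstructed from that logged conditional (the `msg` text is assumed):

```yaml
# Sketch based on the logged false_condition; not the verbatim task file.
- name: Ensure state in ["present", "absent"]
  ansible.builtin.fail:
    msg: "state must be present or absent"  # assumed message
  when: state not in ["present", "absent"]
```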
28011 1726882534.86511: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"]
28011 1726882534.86516: in run() - task 12673a56-9f93-962d-7c65-00000000026b
28011 1726882534.86520: variable 'ansible_search_path' from source: unknown
28011 1726882534.86523: variable 'ansible_search_path' from source: unknown
28011 1726882534.86628: calling self._execute()
28011 1726882534.86803: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882534.86816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882534.86911: variable 'omit' from source: magic vars
28011 1726882534.87705: variable 'ansible_distribution_major_version' from source: facts
28011 1726882534.87724: Evaluated conditional (ansible_distribution_major_version != '6'): True
28011 1726882534.87970: variable 'type' from source: set_fact
28011 1726882534.87981: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False
28011 1726882534.87988: when evaluation is False, skipping this task
28011 1726882534.88000: _execute() done
28011 1726882534.88031: dumping result to json
28011 1726882534.88041: done dumping result, returning
28011 1726882534.88052: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [12673a56-9f93-962d-7c65-00000000026b]
28011 1726882534.88198: sending task result for task 12673a56-9f93-962d-7c65-00000000026b
28011 1726882534.88263: done sending task result for task 12673a56-9f93-962d-7c65-00000000026b
28011 1726882534.88266: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]",
    "skip_reason": "Conditional result was False"
}
28011 1726882534.88318: no more pending results, returning what we have
28011 1726882534.88322: results queue empty
28011 1726882534.88323: checking for any_errors_fatal
28011 1726882534.88329: done checking for any_errors_fatal
28011 1726882534.88330:
checking for max_fail_percentage 28011 1726882534.88331: done checking for max_fail_percentage 28011 1726882534.88332: checking to see if all hosts have failed and the running result is not ok 28011 1726882534.88333: done checking to see if all hosts have failed 28011 1726882534.88333: getting the remaining hosts for this loop 28011 1726882534.88336: done getting the remaining hosts for this loop 28011 1726882534.88339: getting the next task for host managed_node1 28011 1726882534.88346: done getting next task for host managed_node1 28011 1726882534.88349: ^ task is: TASK: Include the task 'show_interfaces.yml' 28011 1726882534.88352: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882534.88357: getting variables 28011 1726882534.88359: in VariableManager get_vars() 28011 1726882534.88405: Calling all_inventory to load vars for managed_node1 28011 1726882534.88408: Calling groups_inventory to load vars for managed_node1 28011 1726882534.88411: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.88424: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.88427: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.88430: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.89105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.89506: done with get_vars() 28011 1726882534.89516: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:35:34 -0400 (0:00:00.044) 0:00:04.449 ****** 28011 1726882534.89806: entering _queue_task() for managed_node1/include_tasks 28011 1726882534.90231: worker is 1 (out of 1 available) 28011 1726882534.90242: exiting _queue_task() for managed_node1/include_tasks 28011 1726882534.90253: done queuing things up, now waiting for results queue to drain 28011 1726882534.90254: waiting for pending results... 
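[Editor's note] The type check is skipped the same way: the log records `false_condition: type not in ["dummy", "tap", "veth"]`, with `type` coming from `set_fact`. The analogous guard, again sketched from the logged conditional rather than the real file:

```yaml
# Sketch based on the logged false_condition; msg text is assumed.
- name: Ensure type in ["dummy", "tap", "veth"]
  ansible.builtin.fail:
    msg: "type must be dummy, tap, or veth"  # assumed message
  when: type not in ["dummy", "tap", "veth"]
```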
28011 1726882534.90604: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 28011 1726882534.90805: in run() - task 12673a56-9f93-962d-7c65-00000000026c 28011 1726882534.90900: variable 'ansible_search_path' from source: unknown 28011 1726882534.90903: variable 'ansible_search_path' from source: unknown 28011 1726882534.90906: calling self._execute() 28011 1726882534.90962: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.90973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.90986: variable 'omit' from source: magic vars 28011 1726882534.91459: variable 'ansible_distribution_major_version' from source: facts 28011 1726882534.91462: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882534.91469: _execute() done 28011 1726882534.91477: dumping result to json 28011 1726882534.91568: done dumping result, returning 28011 1726882534.91571: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-962d-7c65-00000000026c] 28011 1726882534.91573: sending task result for task 12673a56-9f93-962d-7c65-00000000026c 28011 1726882534.91638: done sending task result for task 12673a56-9f93-962d-7c65-00000000026c 28011 1726882534.91641: WORKER PROCESS EXITING 28011 1726882534.91816: no more pending results, returning what we have 28011 1726882534.91820: in VariableManager get_vars() 28011 1726882534.91855: Calling all_inventory to load vars for managed_node1 28011 1726882534.91858: Calling groups_inventory to load vars for managed_node1 28011 1726882534.91860: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.91869: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.91871: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.91874: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.92161: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.92525: done with get_vars() 28011 1726882534.92533: variable 'ansible_search_path' from source: unknown 28011 1726882534.92534: variable 'ansible_search_path' from source: unknown 28011 1726882534.92567: we have included files to process 28011 1726882534.92569: generating all_blocks data 28011 1726882534.92571: done generating all_blocks data 28011 1726882534.92574: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28011 1726882534.92575: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28011 1726882534.92577: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28011 1726882534.92681: in VariableManager get_vars() 28011 1726882534.92710: done with get_vars() 28011 1726882534.92822: done processing included file 28011 1726882534.92824: iterating over new_blocks loaded from include file 28011 1726882534.92825: in VariableManager get_vars() 28011 1726882534.92843: done with get_vars() 28011 1726882534.92845: filtering new block on tags 28011 1726882534.92862: done filtering new block on tags 28011 1726882534.92864: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 28011 1726882534.92869: extending task lists for all hosts with included blocks 28011 1726882534.93266: done extending task lists 28011 1726882534.93267: done processing included files 28011 1726882534.93268: results queue empty 28011 1726882534.93269: checking for any_errors_fatal 28011 1726882534.93271: done checking for any_errors_fatal 28011 1726882534.93272: checking for 
max_fail_percentage 28011 1726882534.93273: done checking for max_fail_percentage 28011 1726882534.93274: checking to see if all hosts have failed and the running result is not ok 28011 1726882534.93275: done checking to see if all hosts have failed 28011 1726882534.93276: getting the remaining hosts for this loop 28011 1726882534.93277: done getting the remaining hosts for this loop 28011 1726882534.93279: getting the next task for host managed_node1 28011 1726882534.93283: done getting next task for host managed_node1 28011 1726882534.93286: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28011 1726882534.93288: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882534.93295: getting variables 28011 1726882534.93296: in VariableManager get_vars() 28011 1726882534.93309: Calling all_inventory to load vars for managed_node1 28011 1726882534.93311: Calling groups_inventory to load vars for managed_node1 28011 1726882534.93314: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.93318: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.93321: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.93323: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.93483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.93670: done with get_vars() 28011 1726882534.93679: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:35:34 -0400 (0:00:00.039) 0:00:04.488 ****** 28011 1726882534.93753: entering _queue_task() for managed_node1/include_tasks 28011 1726882534.93992: worker is 1 (out of 1 available) 28011 1726882534.94207: exiting _queue_task() for managed_node1/include_tasks 28011 1726882534.94218: done queuing things up, now waiting for results queue to drain 28011 1726882534.94222: waiting for pending results... 
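[Editor's note] The log now shows a nested include: manage_test_interface.yml:13 pulls in show_interfaces.yml, whose line 3 in turn includes get_current_interfaces.yml. The outer step, sketched from the logged task name and path (the file layout matches the logged include locations):

```yaml
# Sketch of the nested include chain seen in the log; paths assumed relative
# to the tasks/ directory of the test playbooks.
- name: Include the task 'show_interfaces.yml'
  ansible.builtin.include_tasks:
    file: show_interfaces.yml
```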
28011 1726882534.94511: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 28011 1726882534.94585: in run() - task 12673a56-9f93-962d-7c65-000000000369 28011 1726882534.94625: variable 'ansible_search_path' from source: unknown 28011 1726882534.94639: variable 'ansible_search_path' from source: unknown 28011 1726882534.94680: calling self._execute() 28011 1726882534.94773: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.94784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.94854: variable 'omit' from source: magic vars 28011 1726882534.95172: variable 'ansible_distribution_major_version' from source: facts 28011 1726882534.95200: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882534.95292: _execute() done 28011 1726882534.95297: dumping result to json 28011 1726882534.95299: done dumping result, returning 28011 1726882534.95302: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-962d-7c65-000000000369] 28011 1726882534.95304: sending task result for task 12673a56-9f93-962d-7c65-000000000369 28011 1726882534.95363: done sending task result for task 12673a56-9f93-962d-7c65-000000000369 28011 1726882534.95367: WORKER PROCESS EXITING 28011 1726882534.95421: no more pending results, returning what we have 28011 1726882534.95425: in VariableManager get_vars() 28011 1726882534.95609: Calling all_inventory to load vars for managed_node1 28011 1726882534.95611: Calling groups_inventory to load vars for managed_node1 28011 1726882534.95614: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.95623: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.95626: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.95629: Calling groups_plugins_play to load vars for managed_node1 28011 
1726882534.95843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.96058: done with get_vars() 28011 1726882534.96066: variable 'ansible_search_path' from source: unknown 28011 1726882534.96067: variable 'ansible_search_path' from source: unknown 28011 1726882534.96122: we have included files to process 28011 1726882534.96124: generating all_blocks data 28011 1726882534.96125: done generating all_blocks data 28011 1726882534.96126: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28011 1726882534.96127: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28011 1726882534.96129: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28011 1726882534.96358: done processing included file 28011 1726882534.96359: iterating over new_blocks loaded from include file 28011 1726882534.96361: in VariableManager get_vars() 28011 1726882534.96377: done with get_vars() 28011 1726882534.96379: filtering new block on tags 28011 1726882534.96400: done filtering new block on tags 28011 1726882534.96402: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 28011 1726882534.96406: extending task lists for all hosts with included blocks 28011 1726882534.96551: done extending task lists 28011 1726882534.96552: done processing included files 28011 1726882534.96553: results queue empty 28011 1726882534.96553: checking for any_errors_fatal 28011 1726882534.96557: done checking for any_errors_fatal 28011 1726882534.96557: checking for max_fail_percentage 28011 1726882534.96558: done 
checking for max_fail_percentage 28011 1726882534.96559: checking to see if all hosts have failed and the running result is not ok 28011 1726882534.96560: done checking to see if all hosts have failed 28011 1726882534.96561: getting the remaining hosts for this loop 28011 1726882534.96562: done getting the remaining hosts for this loop 28011 1726882534.96564: getting the next task for host managed_node1 28011 1726882534.96568: done getting next task for host managed_node1 28011 1726882534.96570: ^ task is: TASK: Gather current interface info 28011 1726882534.96573: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882534.96575: getting variables 28011 1726882534.96576: in VariableManager get_vars() 28011 1726882534.96588: Calling all_inventory to load vars for managed_node1 28011 1726882534.96595: Calling groups_inventory to load vars for managed_node1 28011 1726882534.96598: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882534.96603: Calling all_plugins_play to load vars for managed_node1 28011 1726882534.96605: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882534.96608: Calling groups_plugins_play to load vars for managed_node1 28011 1726882534.96769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882534.96951: done with get_vars() 28011 1726882534.96960: done getting variables 28011 1726882534.97002: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:35:34 -0400 (0:00:00.032) 0:00:04.521 ****** 28011 1726882534.97031: entering _queue_task() for managed_node1/command 28011 1726882534.97271: worker is 1 (out of 1 available) 28011 1726882534.97282: exiting _queue_task() for managed_node1/command 28011 1726882534.97401: done queuing things up, now waiting for results queue to drain 28011 1726882534.97403: waiting for pending results... 
28011 1726882534.97610: running TaskExecutor() for managed_node1/TASK: Gather current interface info 28011 1726882534.97669: in run() - task 12673a56-9f93-962d-7c65-0000000003a0 28011 1726882534.97692: variable 'ansible_search_path' from source: unknown 28011 1726882534.97704: variable 'ansible_search_path' from source: unknown 28011 1726882534.97766: calling self._execute() 28011 1726882534.97874: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.97884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.98200: variable 'omit' from source: magic vars 28011 1726882534.98642: variable 'ansible_distribution_major_version' from source: facts 28011 1726882534.98660: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882534.98671: variable 'omit' from source: magic vars 28011 1726882534.98755: variable 'omit' from source: magic vars 28011 1726882534.98797: variable 'omit' from source: magic vars 28011 1726882534.98841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882534.98883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882534.98915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882534.98978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882534.99011: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882534.99104: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882534.99114: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.99123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 
1726882534.99501: Set connection var ansible_connection to ssh 28011 1726882534.99504: Set connection var ansible_pipelining to False 28011 1726882534.99506: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882534.99509: Set connection var ansible_shell_executable to /bin/sh 28011 1726882534.99511: Set connection var ansible_timeout to 10 28011 1726882534.99513: Set connection var ansible_shell_type to sh 28011 1726882534.99515: variable 'ansible_shell_executable' from source: unknown 28011 1726882534.99517: variable 'ansible_connection' from source: unknown 28011 1726882534.99519: variable 'ansible_module_compression' from source: unknown 28011 1726882534.99521: variable 'ansible_shell_type' from source: unknown 28011 1726882534.99523: variable 'ansible_shell_executable' from source: unknown 28011 1726882534.99525: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882534.99527: variable 'ansible_pipelining' from source: unknown 28011 1726882534.99529: variable 'ansible_timeout' from source: unknown 28011 1726882534.99531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882534.99827: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882534.99844: variable 'omit' from source: magic vars 28011 1726882534.99854: starting attempt loop 28011 1726882534.99999: running the handler 28011 1726882535.00003: _low_level_execute_command(): starting 28011 1726882535.00005: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882535.00906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 28011 1726882535.00927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.00977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882535.01001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882535.01045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882535.02750: stdout chunk (state=3): >>>/root <<< 28011 1726882535.02897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882535.02900: stdout chunk (state=3): >>><<< 28011 1726882535.02903: stderr chunk (state=3): >>><<< 28011 1726882535.03087: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882535.03090: _low_level_execute_command(): starting 28011 1726882535.03094: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116 `" && echo ansible-tmp-1726882535.029766-28256-67621039098116="` echo /root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116 `" ) && sleep 0' 28011 1726882535.04013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882535.04018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.04048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.04101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882535.04104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882535.04155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882535.06213: stdout chunk (state=3): >>>ansible-tmp-1726882535.029766-28256-67621039098116=/root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116 <<< 28011 1726882535.06217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882535.06399: stderr chunk (state=3): >>><<< 28011 1726882535.06403: stdout chunk (state=3): >>><<< 28011 1726882535.06406: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882535.029766-28256-67621039098116=/root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882535.06408: variable 'ansible_module_compression' from source: unknown 28011 1726882535.06410: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28011 1726882535.06426: variable 'ansible_facts' from source: unknown 28011 1726882535.06638: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116/AnsiballZ_command.py 28011 1726882535.06879: Sending initial data 28011 1726882535.07009: Sent initial data (154 bytes) 28011 1726882535.07389: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882535.07407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882535.07423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882535.07442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882535.07508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.07545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882535.07560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882535.07582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882535.07653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882535.09164: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882535.09225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882535.09279: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpiwv6_eid /root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116/AnsiballZ_command.py <<< 28011 1726882535.09285: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116/AnsiballZ_command.py" <<< 28011 1726882535.09312: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpiwv6_eid" to remote "/root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116/AnsiballZ_command.py" <<< 28011 1726882535.10065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882535.10069: stderr chunk (state=3): >>><<< 28011 1726882535.10072: stdout chunk (state=3): >>><<< 28011 1726882535.10116: done transferring module to remote 28011 1726882535.10124: _low_level_execute_command(): starting 28011 1726882535.10131: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116/ /root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116/AnsiballZ_command.py && sleep 0' 28011 1726882535.10735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.10748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882535.10759: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882535.10811: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.10864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882535.10879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882535.10905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882535.10999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882535.12679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882535.12704: stderr chunk (state=3): >>><<< 28011 1726882535.12708: stdout chunk (state=3): >>><<< 28011 1726882535.12720: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882535.12723: _low_level_execute_command(): starting 28011 1726882535.12728: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116/AnsiballZ_command.py && sleep 0' 28011 1726882535.13309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.13347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882535.13361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882535.13377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 
1726882535.13463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882535.28918: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:35.284868", "end": "2024-09-20 21:35:35.288151", "delta": "0:00:00.003283", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882535.30419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882535.30482: stderr chunk (state=3): >>><<< 28011 1726882535.30486: stdout chunk (state=3): >>><<< 28011 1726882535.30631: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:35.284868", "end": "2024-09-20 21:35:35.288151", "delta": "0:00:00.003283", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882535.30635: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882535.30637: _low_level_execute_command(): starting 28011 1726882535.30639: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882535.029766-28256-67621039098116/ > /dev/null 2>&1 && sleep 0' 28011 1726882535.31315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882535.31319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882535.31321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.31323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882535.31325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.31408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882535.31448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882535.33308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882535.33321: stderr chunk (state=3): >>><<< 28011 1726882535.33326: stdout chunk (state=3): >>><<< 28011 1726882535.33353: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882535.33398: handler run complete 28011 1726882535.33401: Evaluated conditional (False): False 28011 1726882535.33404: attempt loop complete, returning result 28011 1726882535.33406: _execute() done 28011 1726882535.33408: dumping result to json 28011 1726882535.33410: done dumping result, returning 28011 1726882535.33479: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [12673a56-9f93-962d-7c65-0000000003a0] 28011 1726882535.33482: sending task result for task 12673a56-9f93-962d-7c65-0000000003a0 28011 1726882535.33558: done sending task result for task 12673a56-9f93-962d-7c65-0000000003a0 28011 1726882535.33561: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003283", "end": "2024-09-20 21:35:35.288151", "rc": 0, "start": "2024-09-20 21:35:35.284868" } STDOUT: bonding_masters eth0 lo 28011 1726882535.33768: no more pending results, returning what we have 28011 1726882535.33771: results queue empty 28011 1726882535.33771: checking for any_errors_fatal 28011 1726882535.33773: done checking for any_errors_fatal 28011 1726882535.33774: checking for max_fail_percentage 28011 1726882535.33775: done checking for max_fail_percentage 28011 1726882535.33776: checking to see if 
all hosts have failed and the running result is not ok 28011 1726882535.33776: done checking to see if all hosts have failed 28011 1726882535.33777: getting the remaining hosts for this loop 28011 1726882535.33778: done getting the remaining hosts for this loop 28011 1726882535.33781: getting the next task for host managed_node1 28011 1726882535.33788: done getting next task for host managed_node1 28011 1726882535.33794: ^ task is: TASK: Set current_interfaces 28011 1726882535.33799: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882535.33807: getting variables 28011 1726882535.33808: in VariableManager get_vars() 28011 1726882535.33846: Calling all_inventory to load vars for managed_node1 28011 1726882535.33849: Calling groups_inventory to load vars for managed_node1 28011 1726882535.33851: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882535.33861: Calling all_plugins_play to load vars for managed_node1 28011 1726882535.33864: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882535.33866: Calling groups_plugins_play to load vars for managed_node1 28011 1726882535.34151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882535.34407: done with get_vars() 28011 1726882535.34419: done getting variables 28011 1726882535.34481: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:35:35 -0400 (0:00:00.374) 0:00:04.896 ****** 28011 1726882535.34516: entering _queue_task() for managed_node1/set_fact 28011 1726882535.34824: worker is 1 (out of 1 available) 28011 1726882535.34835: exiting _queue_task() for managed_node1/set_fact 28011 1726882535.34846: done queuing things up, now waiting for results queue to drain 28011 1726882535.34847: waiting for pending results... 
28011 1726882535.35130: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 28011 1726882535.35199: in run() - task 12673a56-9f93-962d-7c65-0000000003a1 28011 1726882535.35204: variable 'ansible_search_path' from source: unknown 28011 1726882535.35207: variable 'ansible_search_path' from source: unknown 28011 1726882535.35337: calling self._execute() 28011 1726882535.35341: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882535.35350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882535.35366: variable 'omit' from source: magic vars 28011 1726882535.35771: variable 'ansible_distribution_major_version' from source: facts 28011 1726882535.35798: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882535.35810: variable 'omit' from source: magic vars 28011 1726882535.35869: variable 'omit' from source: magic vars 28011 1726882535.36000: variable '_current_interfaces' from source: set_fact 28011 1726882535.36068: variable 'omit' from source: magic vars 28011 1726882535.36123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882535.36163: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882535.36192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882535.36319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882535.36322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882535.36325: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882535.36327: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882535.36330: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882535.36398: Set connection var ansible_connection to ssh 28011 1726882535.36412: Set connection var ansible_pipelining to False 28011 1726882535.36431: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882535.36442: Set connection var ansible_shell_executable to /bin/sh 28011 1726882535.36455: Set connection var ansible_timeout to 10 28011 1726882535.36465: Set connection var ansible_shell_type to sh 28011 1726882535.36496: variable 'ansible_shell_executable' from source: unknown 28011 1726882535.36506: variable 'ansible_connection' from source: unknown 28011 1726882535.36513: variable 'ansible_module_compression' from source: unknown 28011 1726882535.36521: variable 'ansible_shell_type' from source: unknown 28011 1726882535.36532: variable 'ansible_shell_executable' from source: unknown 28011 1726882535.36544: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882535.36552: variable 'ansible_pipelining' from source: unknown 28011 1726882535.36560: variable 'ansible_timeout' from source: unknown 28011 1726882535.36568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882535.36720: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882535.36753: variable 'omit' from source: magic vars 28011 1726882535.36756: starting attempt loop 28011 1726882535.36759: running the handler 28011 1726882535.36863: handler run complete 28011 1726882535.36866: attempt loop complete, returning result 28011 1726882535.36869: _execute() done 28011 1726882535.36871: dumping result to json 28011 1726882535.36873: done dumping result, returning 28011 
1726882535.36876: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [12673a56-9f93-962d-7c65-0000000003a1] 28011 1726882535.36878: sending task result for task 12673a56-9f93-962d-7c65-0000000003a1 28011 1726882535.36947: done sending task result for task 12673a56-9f93-962d-7c65-0000000003a1 28011 1726882535.36951: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 28011 1726882535.37029: no more pending results, returning what we have 28011 1726882535.37033: results queue empty 28011 1726882535.37034: checking for any_errors_fatal 28011 1726882535.37042: done checking for any_errors_fatal 28011 1726882535.37043: checking for max_fail_percentage 28011 1726882535.37045: done checking for max_fail_percentage 28011 1726882535.37046: checking to see if all hosts have failed and the running result is not ok 28011 1726882535.37047: done checking to see if all hosts have failed 28011 1726882535.37047: getting the remaining hosts for this loop 28011 1726882535.37049: done getting the remaining hosts for this loop 28011 1726882535.37053: getting the next task for host managed_node1 28011 1726882535.37063: done getting next task for host managed_node1 28011 1726882535.37066: ^ task is: TASK: Show current_interfaces 28011 1726882535.37070: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882535.37299: getting variables 28011 1726882535.37302: in VariableManager get_vars() 28011 1726882535.37340: Calling all_inventory to load vars for managed_node1 28011 1726882535.37343: Calling groups_inventory to load vars for managed_node1 28011 1726882535.37345: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882535.37354: Calling all_plugins_play to load vars for managed_node1 28011 1726882535.37357: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882535.37360: Calling groups_plugins_play to load vars for managed_node1 28011 1726882535.37548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882535.37759: done with get_vars() 28011 1726882535.37769: done getting variables 28011 1726882535.37836: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:35:35 -0400 (0:00:00.033) 0:00:04.929 ****** 28011 1726882535.37864: entering _queue_task() for managed_node1/debug 28011 1726882535.38108: worker is 1 (out of 1 available) 28011 1726882535.38120: exiting _queue_task() for managed_node1/debug 28011 1726882535.38132: done queuing things up, now waiting for results queue to drain 28011 1726882535.38133: waiting for pending results... 
28011 1726882535.38295: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 28011 1726882535.38362: in run() - task 12673a56-9f93-962d-7c65-00000000036a 28011 1726882535.38370: variable 'ansible_search_path' from source: unknown 28011 1726882535.38374: variable 'ansible_search_path' from source: unknown 28011 1726882535.38405: calling self._execute() 28011 1726882535.38464: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882535.38468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882535.38478: variable 'omit' from source: magic vars 28011 1726882535.38734: variable 'ansible_distribution_major_version' from source: facts 28011 1726882535.38744: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882535.38750: variable 'omit' from source: magic vars 28011 1726882535.38776: variable 'omit' from source: magic vars 28011 1726882535.38848: variable 'current_interfaces' from source: set_fact 28011 1726882535.38867: variable 'omit' from source: magic vars 28011 1726882535.38898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882535.38926: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882535.38942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882535.38955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882535.38964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882535.38987: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882535.38992: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882535.38997: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882535.39062: Set connection var ansible_connection to ssh 28011 1726882535.39068: Set connection var ansible_pipelining to False 28011 1726882535.39074: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882535.39079: Set connection var ansible_shell_executable to /bin/sh 28011 1726882535.39085: Set connection var ansible_timeout to 10 28011 1726882535.39092: Set connection var ansible_shell_type to sh 28011 1726882535.39109: variable 'ansible_shell_executable' from source: unknown 28011 1726882535.39112: variable 'ansible_connection' from source: unknown 28011 1726882535.39115: variable 'ansible_module_compression' from source: unknown 28011 1726882535.39117: variable 'ansible_shell_type' from source: unknown 28011 1726882535.39121: variable 'ansible_shell_executable' from source: unknown 28011 1726882535.39123: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882535.39126: variable 'ansible_pipelining' from source: unknown 28011 1726882535.39129: variable 'ansible_timeout' from source: unknown 28011 1726882535.39132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882535.39228: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882535.39236: variable 'omit' from source: magic vars 28011 1726882535.39238: starting attempt loop 28011 1726882535.39250: running the handler 28011 1726882535.39280: handler run complete 28011 1726882535.39295: attempt loop complete, returning result 28011 1726882535.39298: _execute() done 28011 1726882535.39301: dumping result to json 28011 1726882535.39303: done dumping result, returning 28011 1726882535.39306: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [12673a56-9f93-962d-7c65-00000000036a] 28011 1726882535.39311: sending task result for task 12673a56-9f93-962d-7c65-00000000036a 28011 1726882535.39385: done sending task result for task 12673a56-9f93-962d-7c65-00000000036a 28011 1726882535.39387: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 28011 1726882535.39438: no more pending results, returning what we have 28011 1726882535.39441: results queue empty 28011 1726882535.39442: checking for any_errors_fatal 28011 1726882535.39445: done checking for any_errors_fatal 28011 1726882535.39446: checking for max_fail_percentage 28011 1726882535.39447: done checking for max_fail_percentage 28011 1726882535.39448: checking to see if all hosts have failed and the running result is not ok 28011 1726882535.39449: done checking to see if all hosts have failed 28011 1726882535.39449: getting the remaining hosts for this loop 28011 1726882535.39451: done getting the remaining hosts for this loop 28011 1726882535.39454: getting the next task for host managed_node1 28011 1726882535.39460: done getting next task for host managed_node1 28011 1726882535.39462: ^ task is: TASK: Install iproute 28011 1726882535.39465: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882535.39468: getting variables 28011 1726882535.39469: in VariableManager get_vars() 28011 1726882535.39505: Calling all_inventory to load vars for managed_node1 28011 1726882535.39508: Calling groups_inventory to load vars for managed_node1 28011 1726882535.39510: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882535.39518: Calling all_plugins_play to load vars for managed_node1 28011 1726882535.39520: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882535.39523: Calling groups_plugins_play to load vars for managed_node1 28011 1726882535.39698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882535.39882: done with get_vars() 28011 1726882535.39896: done getting variables 28011 1726882535.39947: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:35:35 -0400 (0:00:00.021) 0:00:04.951 ****** 28011 1726882535.39976: entering _queue_task() for managed_node1/package 28011 1726882535.40206: worker is 1 (out of 1 available) 28011 1726882535.40220: exiting _queue_task() for managed_node1/package 28011 1726882535.40232: done queuing things up, now waiting for results queue to drain 28011 1726882535.40234: waiting for pending results... 
28011 1726882535.40601: running TaskExecutor() for managed_node1/TASK: Install iproute 28011 1726882535.40606: in run() - task 12673a56-9f93-962d-7c65-00000000026d 28011 1726882535.40613: variable 'ansible_search_path' from source: unknown 28011 1726882535.40621: variable 'ansible_search_path' from source: unknown 28011 1726882535.40663: calling self._execute() 28011 1726882535.40744: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882535.40754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882535.40757: variable 'omit' from source: magic vars 28011 1726882535.41026: variable 'ansible_distribution_major_version' from source: facts 28011 1726882535.41034: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882535.41040: variable 'omit' from source: magic vars 28011 1726882535.41065: variable 'omit' from source: magic vars 28011 1726882535.41194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882535.42999: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882535.43003: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882535.43005: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882535.43007: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882535.43015: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882535.43106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882535.43139: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882535.43170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882535.43219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882535.43240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882535.43344: variable '__network_is_ostree' from source: set_fact 28011 1726882535.43359: variable 'omit' from source: magic vars 28011 1726882535.43394: variable 'omit' from source: magic vars 28011 1726882535.43427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882535.43461: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882535.43483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882535.43507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882535.43520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882535.43553: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882535.43561: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882535.43574: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 28011 1726882535.43667: Set connection var ansible_connection to ssh 28011 1726882535.43684: Set connection var ansible_pipelining to False 28011 1726882535.43696: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882535.43706: Set connection var ansible_shell_executable to /bin/sh 28011 1726882535.43719: Set connection var ansible_timeout to 10 28011 1726882535.43727: Set connection var ansible_shell_type to sh 28011 1726882535.43753: variable 'ansible_shell_executable' from source: unknown 28011 1726882535.43761: variable 'ansible_connection' from source: unknown 28011 1726882535.43785: variable 'ansible_module_compression' from source: unknown 28011 1726882535.43788: variable 'ansible_shell_type' from source: unknown 28011 1726882535.43791: variable 'ansible_shell_executable' from source: unknown 28011 1726882535.43794: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882535.43796: variable 'ansible_pipelining' from source: unknown 28011 1726882535.43798: variable 'ansible_timeout' from source: unknown 28011 1726882535.43896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882535.43907: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882535.43923: variable 'omit' from source: magic vars 28011 1726882535.43932: starting attempt loop 28011 1726882535.43939: running the handler 28011 1726882535.43952: variable 'ansible_facts' from source: unknown 28011 1726882535.43959: variable 'ansible_facts' from source: unknown 28011 1726882535.44001: _low_level_execute_command(): starting 28011 1726882535.44024: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 
1726882535.44542: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882535.44549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882535.44552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.44555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882535.44557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882535.44559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.44603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882535.44606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882535.44608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882535.44661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882535.46299: stdout chunk (state=3): >>>/root <<< 28011 1726882535.46463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882535.46466: stdout chunk (state=3): >>><<< 28011 1726882535.46477: stderr chunk (state=3): >>><<< 28011 1726882535.46524: _low_level_execute_command() done: 
rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882535.46615: _low_level_execute_command(): starting 28011 1726882535.46618: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770 `" && echo ansible-tmp-1726882535.4651356-28284-13303965445770="` echo /root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770 `" ) && sleep 0' 28011 1726882535.47154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882535.47168: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882535.47202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882535.47251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.47333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882535.47336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882535.47349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882535.47381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882535.49254: stdout chunk (state=3): >>>ansible-tmp-1726882535.4651356-28284-13303965445770=/root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770 <<< 28011 1726882535.49416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882535.49420: stdout chunk (state=3): >>><<< 28011 1726882535.49422: stderr chunk (state=3): >>><<< 28011 1726882535.49444: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882535.4651356-28284-13303965445770=/root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882535.49598: variable 'ansible_module_compression' from source: unknown 28011 1726882535.49603: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 28011 1726882535.49605: ANSIBALLZ: Acquiring lock 28011 1726882535.49607: ANSIBALLZ: Lock acquired: 139767565767152 28011 1726882535.49609: ANSIBALLZ: Creating module 28011 1726882535.59574: ANSIBALLZ: Writing module into payload 28011 1726882535.59740: ANSIBALLZ: Writing module 28011 1726882535.59757: ANSIBALLZ: Renaming module 28011 1726882535.59769: ANSIBALLZ: Done creating module 28011 1726882535.59784: variable 'ansible_facts' from source: unknown 28011 1726882535.59847: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770/AnsiballZ_dnf.py 28011 1726882535.59950: Sending initial data 28011 1726882535.59954: Sent initial data (151 bytes) 28011 1726882535.60400: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882535.60403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882535.60405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28011 1726882535.60407: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882535.60409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.60462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882535.60466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882535.60516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882535.62113: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882535.62156: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882535.62211: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmptrjgj00i /root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770/AnsiballZ_dnf.py <<< 28011 1726882535.62214: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770/AnsiballZ_dnf.py" <<< 28011 1726882535.62250: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmptrjgj00i" to remote "/root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770/AnsiballZ_dnf.py" <<< 28011 1726882535.63384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882535.63387: stdout chunk (state=3): >>><<< 28011 1726882535.63391: stderr chunk (state=3): >>><<< 28011 1726882535.63397: done transferring module to remote 28011 1726882535.63400: _low_level_execute_command(): starting 28011 1726882535.63402: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770/ /root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770/AnsiballZ_dnf.py && sleep 0' 28011 1726882535.63970: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882535.63980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882535.64025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882535.65744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882535.65906: stderr chunk (state=3): >>><<< 28011 1726882535.65910: stdout chunk (state=3): >>><<< 28011 1726882535.65913: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882535.65915: _low_level_execute_command(): starting 28011 1726882535.65918: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770/AnsiballZ_dnf.py && sleep 0' 28011 1726882535.66519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882535.66534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882535.66549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882535.66580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882535.66611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.66684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882535.66724: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882535.66743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882535.66779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882535.67118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.07742: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 28011 1726882536.11739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882536.11771: stderr chunk (state=3): >>><<< 28011 1726882536.11774: stdout chunk (state=3): >>><<< 28011 1726882536.11792: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882536.11828: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882536.11834: _low_level_execute_command(): starting 28011 1726882536.11839: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882535.4651356-28284-13303965445770/ > /dev/null 2>&1 && sleep 0' 28011 1726882536.12290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.12297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.12315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.12366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882536.12369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.12371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.12424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.14244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.14267: stderr chunk (state=3): >>><<< 28011 1726882536.14270: stdout chunk (state=3): >>><<< 28011 1726882536.14283: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882536.14289: handler run complete 28011 1726882536.14410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882536.14541: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882536.14573: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882536.14599: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882536.14622: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882536.14674: variable '__install_status' from source: unknown 28011 1726882536.14691: Evaluated conditional (__install_status is success): True 28011 1726882536.14707: attempt loop complete, returning result 28011 1726882536.14710: _execute() done 28011 1726882536.14712: dumping result to json 28011 1726882536.14717: done dumping result, returning 28011 1726882536.14724: done running TaskExecutor() for managed_node1/TASK: Install iproute [12673a56-9f93-962d-7c65-00000000026d] 28011 1726882536.14727: sending task result for task 12673a56-9f93-962d-7c65-00000000026d 28011 1726882536.14821: done sending task result for task 12673a56-9f93-962d-7c65-00000000026d 28011 1726882536.14823: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 28011 1726882536.14902: no more pending results, returning what we have 28011 1726882536.14905: results queue empty 28011 1726882536.14906: checking for any_errors_fatal 28011 1726882536.14910: done checking for any_errors_fatal 28011 1726882536.14911: checking for 
max_fail_percentage 28011 1726882536.14913: done checking for max_fail_percentage 28011 1726882536.14913: checking to see if all hosts have failed and the running result is not ok 28011 1726882536.14914: done checking to see if all hosts have failed 28011 1726882536.14915: getting the remaining hosts for this loop 28011 1726882536.14916: done getting the remaining hosts for this loop 28011 1726882536.14919: getting the next task for host managed_node1 28011 1726882536.14925: done getting next task for host managed_node1 28011 1726882536.14928: ^ task is: TASK: Create veth interface {{ interface }} 28011 1726882536.14930: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882536.14934: getting variables 28011 1726882536.14935: in VariableManager get_vars() 28011 1726882536.14974: Calling all_inventory to load vars for managed_node1 28011 1726882536.14976: Calling groups_inventory to load vars for managed_node1 28011 1726882536.14978: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882536.14988: Calling all_plugins_play to load vars for managed_node1 28011 1726882536.14990: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882536.15037: Calling groups_plugins_play to load vars for managed_node1 28011 1726882536.15200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882536.15327: done with get_vars() 28011 1726882536.15336: done getting variables 28011 1726882536.15375: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882536.15467: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:35:36 -0400 (0:00:00.755) 0:00:05.706 ****** 28011 1726882536.15489: entering _queue_task() for managed_node1/command 28011 1726882536.15699: worker is 1 (out of 1 available) 28011 1726882536.15714: exiting _queue_task() for managed_node1/command 28011 1726882536.15727: done queuing things up, now waiting for results queue to drain 28011 1726882536.15729: waiting for pending results... 
28011 1726882536.15887: running TaskExecutor() for managed_node1/TASK: Create veth interface ethtest0 28011 1726882536.15955: in run() - task 12673a56-9f93-962d-7c65-00000000026e 28011 1726882536.15965: variable 'ansible_search_path' from source: unknown 28011 1726882536.15968: variable 'ansible_search_path' from source: unknown 28011 1726882536.16166: variable 'interface' from source: set_fact 28011 1726882536.16226: variable 'interface' from source: set_fact 28011 1726882536.16498: variable 'interface' from source: set_fact 28011 1726882536.16502: Loaded config def from plugin (lookup/items) 28011 1726882536.16505: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 28011 1726882536.16507: variable 'omit' from source: magic vars 28011 1726882536.16587: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882536.16609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882536.16624: variable 'omit' from source: magic vars 28011 1726882536.17185: variable 'ansible_distribution_major_version' from source: facts 28011 1726882536.17205: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882536.17402: variable 'type' from source: set_fact 28011 1726882536.17415: variable 'state' from source: include params 28011 1726882536.17424: variable 'interface' from source: set_fact 28011 1726882536.17427: variable 'current_interfaces' from source: set_fact 28011 1726882536.17433: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 28011 1726882536.17439: variable 'omit' from source: magic vars 28011 1726882536.17495: variable 'omit' from source: magic vars 28011 1726882536.17519: variable 'item' from source: unknown 28011 1726882536.17567: variable 'item' from source: unknown 28011 1726882536.17584: variable 'omit' from source: magic vars 28011 1726882536.17613: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882536.17641: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882536.17656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882536.17669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882536.17679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882536.17709: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882536.17713: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882536.17715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882536.17776: Set connection var ansible_connection to ssh 28011 1726882536.17783: Set connection var ansible_pipelining to False 28011 1726882536.17791: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882536.17800: Set connection var ansible_shell_executable to /bin/sh 28011 1726882536.17805: Set connection var ansible_timeout to 10 28011 1726882536.17810: Set connection var ansible_shell_type to sh 28011 1726882536.17826: variable 'ansible_shell_executable' from source: unknown 28011 1726882536.17828: variable 'ansible_connection' from source: unknown 28011 1726882536.17830: variable 'ansible_module_compression' from source: unknown 28011 1726882536.17833: variable 'ansible_shell_type' from source: unknown 28011 1726882536.17835: variable 'ansible_shell_executable' from source: unknown 28011 1726882536.17838: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882536.17842: variable 'ansible_pipelining' from source: unknown 28011 1726882536.17844: variable 'ansible_timeout' from 
source: unknown 28011 1726882536.17849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882536.17945: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882536.17952: variable 'omit' from source: magic vars 28011 1726882536.17955: starting attempt loop 28011 1726882536.17958: running the handler 28011 1726882536.17972: _low_level_execute_command(): starting 28011 1726882536.17978: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882536.18468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.18472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882536.18475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882536.18477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.18529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' <<< 28011 1726882536.18536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.18538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.18576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.20154: stdout chunk (state=3): >>>/root <<< 28011 1726882536.20307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.20311: stdout chunk (state=3): >>><<< 28011 1726882536.20313: stderr chunk (state=3): >>><<< 28011 1726882536.20334: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882536.20433: _low_level_execute_command(): starting 28011 1726882536.20437: _low_level_execute_command(): executing: /bin/sh 
-c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027 `" && echo ansible-tmp-1726882536.2035005-28326-12180038938027="` echo /root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027 `" ) && sleep 0' 28011 1726882536.20956: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882536.20965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882536.20979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.21004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882536.21026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882536.21051: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.21109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.21155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882536.21174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.21197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.21273: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28011 1726882536.23116: stdout chunk (state=3): >>>ansible-tmp-1726882536.2035005-28326-12180038938027=/root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027 <<< 28011 1726882536.23267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.23279: stdout chunk (state=3): >>><<< 28011 1726882536.23312: stderr chunk (state=3): >>><<< 28011 1726882536.23332: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882536.2035005-28326-12180038938027=/root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882536.23366: variable 'ansible_module_compression' from source: unknown 28011 1726882536.23450: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28011 1726882536.23496: variable 'ansible_facts' from source: unknown 28011 1726882536.23610: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027/AnsiballZ_command.py 28011 1726882536.23817: Sending initial data 28011 1726882536.24106: Sent initial data (155 bytes) 28011 1726882536.24583: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882536.24600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882536.24617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.24637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882536.24714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.24750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882536.24766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.24788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.24855: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28011 1726882536.26372: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882536.26440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882536.26492: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpbdwb82fc /root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027/AnsiballZ_command.py <<< 28011 1726882536.26501: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027/AnsiballZ_command.py" <<< 28011 1726882536.26541: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpbdwb82fc" to remote "/root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027/AnsiballZ_command.py" <<< 28011 1726882536.27297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.27300: stdout chunk (state=3): >>><<< 28011 1726882536.27303: stderr chunk (state=3): >>><<< 28011 
1726882536.27312: done transferring module to remote 28011 1726882536.27329: _low_level_execute_command(): starting 28011 1726882536.27338: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027/ /root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027/AnsiballZ_command.py && sleep 0' 28011 1726882536.27935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882536.27965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882536.28015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.28087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882536.28119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.28206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.29912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.29936: stderr chunk (state=3): >>><<< 28011 
1726882536.29939: stdout chunk (state=3): >>><<< 28011 1726882536.29953: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882536.29956: _low_level_execute_command(): starting 28011 1726882536.29962: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027/AnsiballZ_command.py && sleep 0' 28011 1726882536.30364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.30368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.30370: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882536.30372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882536.30374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.30422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.30426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.30476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.46245: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 21:35:36.452153", "end": "2024-09-20 21:35:36.457951", "delta": "0:00:00.005798", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882536.48533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882536.48537: stdout chunk (state=3): >>><<< 28011 1726882536.48540: stderr chunk (state=3): >>><<< 28011 1726882536.48680: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 21:35:36.452153", "end": "2024-09-20 21:35:36.457951", "delta": "0:00:00.005798", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882536.48683: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882536.48691: _low_level_execute_command(): starting 28011 1726882536.48695: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882536.2035005-28326-12180038938027/ > /dev/null 2>&1 && sleep 0' 28011 1726882536.49335: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882536.49366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882536.49480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882536.49516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.49534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.49621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.53944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.53955: stdout chunk (state=3): >>><<< 28011 1726882536.53963: stderr chunk (state=3): >>><<< 28011 1726882536.54002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 
1726882536.54006: handler run complete 28011 1726882536.54008: Evaluated conditional (False): False 28011 1726882536.54010: attempt loop complete, returning result 28011 1726882536.54027: variable 'item' from source: unknown 28011 1726882536.54088: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.005798", "end": "2024-09-20 21:35:36.457951", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-20 21:35:36.452153" } 28011 1726882536.54257: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882536.54260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882536.54263: variable 'omit' from source: magic vars 28011 1726882536.54334: variable 'ansible_distribution_major_version' from source: facts 28011 1726882536.54338: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882536.54455: variable 'type' from source: set_fact 28011 1726882536.54458: variable 'state' from source: include params 28011 1726882536.54461: variable 'interface' from source: set_fact 28011 1726882536.54466: variable 'current_interfaces' from source: set_fact 28011 1726882536.54471: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 28011 1726882536.54476: variable 'omit' from source: magic vars 28011 1726882536.54494: variable 'omit' from source: magic vars 28011 1726882536.54518: variable 'item' from source: unknown 28011 1726882536.54560: variable 'item' from source: unknown 28011 1726882536.54571: variable 'omit' from source: magic vars 28011 1726882536.54587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 28011 1726882536.54596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882536.54606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882536.54615: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882536.54618: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882536.54620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882536.54665: Set connection var ansible_connection to ssh 28011 1726882536.54668: Set connection var ansible_pipelining to False 28011 1726882536.54674: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882536.54679: Set connection var ansible_shell_executable to /bin/sh 28011 1726882536.54685: Set connection var ansible_timeout to 10 28011 1726882536.54692: Set connection var ansible_shell_type to sh 28011 1726882536.54715: variable 'ansible_shell_executable' from source: unknown 28011 1726882536.54718: variable 'ansible_connection' from source: unknown 28011 1726882536.54720: variable 'ansible_module_compression' from source: unknown 28011 1726882536.54722: variable 'ansible_shell_type' from source: unknown 28011 1726882536.54724: variable 'ansible_shell_executable' from source: unknown 28011 1726882536.54726: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882536.54728: variable 'ansible_pipelining' from source: unknown 28011 1726882536.54730: variable 'ansible_timeout' from source: unknown 28011 1726882536.54732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882536.54795: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882536.54801: variable 'omit' from source: magic vars 28011 1726882536.54805: starting attempt loop 28011 1726882536.54807: running the handler 28011 1726882536.54815: _low_level_execute_command(): starting 28011 1726882536.54818: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882536.55247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.55250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882536.55252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.55261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882536.55263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882536.55265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.55313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882536.55316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.55320: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.55365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.56935: stdout chunk (state=3): >>>/root <<< 28011 1726882536.57024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.57051: stderr chunk (state=3): >>><<< 28011 1726882536.57053: stdout chunk (state=3): >>><<< 28011 1726882536.57098: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882536.57100: _low_level_execute_command(): starting 28011 1726882536.57102: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470 `" && echo 
ansible-tmp-1726882536.5706801-28326-169947610775470="` echo /root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470 `" ) && sleep 0' 28011 1726882536.57456: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.57469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882536.57479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.57526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882536.57539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.57585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.59427: stdout chunk (state=3): >>>ansible-tmp-1726882536.5706801-28326-169947610775470=/root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470 <<< 28011 1726882536.59627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.59630: stdout chunk (state=3): >>><<< 28011 1726882536.59632: stderr chunk (state=3): >>><<< 28011 
1726882536.59645: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882536.5706801-28326-169947610775470=/root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882536.59701: variable 'ansible_module_compression' from source: unknown 28011 1726882536.59724: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28011 1726882536.59749: variable 'ansible_facts' from source: unknown 28011 1726882536.59891: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470/AnsiballZ_command.py 28011 1726882536.59976: Sending initial data 28011 1726882536.59979: Sent initial data (156 bytes) 28011 1726882536.60383: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882536.60386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882536.60388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.60438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.60449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.60486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.61999: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 
1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882536.62045: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882536.62096: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpnsqedj3q /root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470/AnsiballZ_command.py <<< 28011 1726882536.62100: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470/AnsiballZ_command.py" <<< 28011 1726882536.62147: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpnsqedj3q" to remote "/root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470/AnsiballZ_command.py" <<< 28011 1726882536.62717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.62752: stderr chunk (state=3): >>><<< 28011 1726882536.62755: stdout chunk (state=3): >>><<< 28011 1726882536.62779: done transferring module to remote 28011 1726882536.62785: _low_level_execute_command(): starting 28011 1726882536.62790: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470/ /root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470/AnsiballZ_command.py && sleep 0' 28011 1726882536.63155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.63192: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882536.63198: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.63200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882536.63202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882536.63204: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.63243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.63246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.63299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.64986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.65010: stderr chunk (state=3): >>><<< 28011 1726882536.65013: stdout chunk (state=3): >>><<< 28011 1726882536.65026: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882536.65030: _low_level_execute_command(): starting 28011 1726882536.65033: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470/AnsiballZ_command.py && sleep 0' 28011 1726882536.65419: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.65422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.65425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.65426: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.65476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.65482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.65526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.81010: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 21:35:36.805019", "end": "2024-09-20 21:35:36.809049", "delta": "0:00:00.004030", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882536.82509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882536.82517: stdout chunk (state=3): >>><<< 28011 1726882536.82520: stderr chunk (state=3): >>><<< 28011 1726882536.82532: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 21:35:36.805019", "end": "2024-09-20 21:35:36.809049", "delta": "0:00:00.004030", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
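The stdout chunk above carries the `ansible.legacy.command` module result as a single JSON object. A minimal sketch that parses it and cross-checks the reported timing; the JSON values are copied verbatim from the log, nothing else is assumed:

```python
import json
from datetime import datetime

# Module result JSON copied verbatim from the stdout chunk in the log above.
raw = ('{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
       '"cmd": ["ip", "link", "set", "peerethtest0", "up"], '
       '"start": "2024-09-20 21:35:36.805019", "end": "2024-09-20 21:35:36.809049", '
       '"delta": "0:00:00.004030", "msg": "", "invocation": {"module_args": '
       '{"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, '
       '"expand_argument_vars": true, "stdin_add_newline": true, '
       '"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, '
       '"creates": null, "removes": null, "stdin": null}}}')
result = json.loads(raw)

# The start/end timestamps agree with the reported delta field.
fmt = "%Y-%m-%d %H:%M:%S.%f"
elapsed = datetime.strptime(result["end"], fmt) - datetime.strptime(result["start"], fmt)
print(result["rc"], result["cmd"], str(elapsed))
```

Note that the module itself reports `"changed": true`, while the task result printed later shows `"changed": false`; the controller can override the module's value (e.g. via `changed_when`) after the result comes back.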
28011 1726882536.82560: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882536.82564: _low_level_execute_command(): starting 28011 1726882536.82570: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882536.5706801-28326-169947610775470/ > /dev/null 2>&1 && sleep 0' 28011 1726882536.83028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.83031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.83034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882536.83037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882536.83040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.83087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882536.83099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.83136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.84921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.84944: stderr chunk (state=3): >>><<< 28011 1726882536.84947: stdout chunk (state=3): >>><<< 28011 1726882536.84961: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882536.84966: handler run complete 28011 1726882536.84982: Evaluated conditional (False): False 28011 1726882536.84989: attempt loop complete, returning result 28011 1726882536.85008: variable 'item' from source: unknown 28011 1726882536.85067: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.004030", "end": "2024-09-20 21:35:36.809049", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-20 21:35:36.805019" } 28011 1726882536.85182: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882536.85185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882536.85187: variable 'omit' from source: magic vars 28011 1726882536.85284: variable 'ansible_distribution_major_version' from source: facts 28011 1726882536.85287: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882536.85410: variable 'type' from source: set_fact 28011 1726882536.85413: variable 'state' from source: include params 28011 1726882536.85416: variable 'interface' from source: set_fact 28011 1726882536.85423: variable 'current_interfaces' from source: set_fact 28011 1726882536.85425: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 28011 1726882536.85430: variable 'omit' from source: magic vars 28011 1726882536.85441: variable 'omit' from source: magic vars 28011 1726882536.85466: variable 'item' from source: unknown 28011 1726882536.85514: variable 'item' from source: unknown 28011 1726882536.85530: variable 'omit' from source: magic vars 28011 1726882536.85543: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882536.85550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882536.85557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882536.85566: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882536.85569: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882536.85571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882536.85621: Set connection var ansible_connection to ssh 28011 1726882536.85626: Set connection var ansible_pipelining to False 28011 1726882536.85632: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882536.85640: Set connection var ansible_shell_executable to /bin/sh 28011 1726882536.85645: Set connection var ansible_timeout to 10 28011 1726882536.85648: Set connection var ansible_shell_type to sh 28011 1726882536.85662: variable 'ansible_shell_executable' from source: unknown 28011 1726882536.85665: variable 'ansible_connection' from source: unknown 28011 1726882536.85667: variable 'ansible_module_compression' from source: unknown 28011 1726882536.85669: variable 'ansible_shell_type' from source: unknown 28011 1726882536.85671: variable 'ansible_shell_executable' from source: unknown 28011 1726882536.85673: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882536.85678: variable 'ansible_pipelining' from source: unknown 28011 1726882536.85680: variable 'ansible_timeout' from source: unknown 28011 1726882536.85685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882536.85751: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882536.85757: variable 'omit' from source: magic vars 28011 1726882536.85760: starting attempt loop 28011 1726882536.85763: running the handler 28011 1726882536.85769: _low_level_execute_command(): starting 28011 1726882536.85772: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882536.86204: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.86207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.86209: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.86220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.86268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882536.86271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.86275: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 28011 1726882536.86322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.87876: stdout chunk (state=3): >>>/root <<< 28011 1726882536.87975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.88003: stderr chunk (state=3): >>><<< 28011 1726882536.88006: stdout chunk (state=3): >>><<< 28011 1726882536.88018: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882536.88025: _low_level_execute_command(): starting 28011 1726882536.88029: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520 `" && echo 
ansible-tmp-1726882536.8801703-28326-66110637554520="` echo /root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520 `" ) && sleep 0' 28011 1726882536.88427: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.88430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.88433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.88435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.88474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.88478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.88527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.90383: stdout chunk (state=3): >>>ansible-tmp-1726882536.8801703-28326-66110637554520=/root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520 <<< 28011 1726882536.90562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.90565: stdout chunk (state=3): >>><<< 28011 
1726882536.90567: stderr chunk (state=3): >>><<< 28011 1726882536.90582: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882536.8801703-28326-66110637554520=/root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882536.90798: variable 'ansible_module_compression' from source: unknown 28011 1726882536.90802: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28011 1726882536.90804: variable 'ansible_facts' from source: unknown 28011 1726882536.90806: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520/AnsiballZ_command.py 28011 1726882536.90944: Sending initial data 28011 1726882536.90947: Sent initial data (155 bytes) 28011 1726882536.91676: 
stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882536.91706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882536.91722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882536.91814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.91841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882536.91858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882536.91879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.92031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.93532: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882536.93650: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882536.93919: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpb6nwek6r /root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520/AnsiballZ_command.py <<< 28011 1726882536.93924: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpb6nwek6r" to remote "/root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520/AnsiballZ_command.py" <<< 28011 1726882536.95062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.95068: stderr chunk (state=3): >>><<< 28011 1726882536.95070: stdout chunk (state=3): >>><<< 28011 1726882536.95073: done transferring module to remote 28011 1726882536.95075: _low_level_execute_command(): starting 28011 1726882536.95077: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520/ /root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520/AnsiballZ_command.py && sleep 0' 28011 1726882536.95735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.95843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.95920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882536.97673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882536.97676: stdout chunk (state=3): >>><<< 28011 1726882536.97678: stderr chunk (state=3): >>><<< 28011 1726882536.97850: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882536.97853: _low_level_execute_command(): starting 28011 1726882536.97856: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520/AnsiballZ_command.py && sleep 0' 28011 1726882536.98519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882536.98582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882536.98646: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882536.98678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882537.13973: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 21:35:37.134251", "end": "2024-09-20 21:35:37.137916", "delta": "0:00:00.003665", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882537.15605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882537.15608: stdout chunk (state=3): >>><<< 28011 1726882537.15610: stderr chunk (state=3): >>><<< 28011 1726882537.15612: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 21:35:37.134251", "end": "2024-09-20 21:35:37.137916", "delta": "0:00:00.003665", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882537.15614: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882537.15617: _low_level_execute_command(): starting 28011 1726882537.15619: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882536.8801703-28326-66110637554520/ > /dev/null 2>&1 && sleep 0' 28011 1726882537.16237: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882537.16285: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.16306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882537.16374: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.16418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882537.16439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882537.16463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882537.16545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882537.18414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882537.18498: stdout chunk (state=3): >>><<< 28011 1726882537.18502: stderr chunk (state=3): >>><<< 28011 1726882537.18504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882537.18506: handler run complete 28011 1726882537.18508: Evaluated conditional (False): False 28011 1726882537.18510: attempt loop complete, returning result 28011 1726882537.18533: variable 'item' from source: unknown 28011 1726882537.18798: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.003665", "end": "2024-09-20 21:35:37.137916", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-20 21:35:37.134251" } 28011 1726882537.18899: dumping result to json 28011 1726882537.18901: done dumping result, returning 28011 1726882537.18904: done running TaskExecutor() for managed_node1/TASK: Create veth interface ethtest0 [12673a56-9f93-962d-7c65-00000000026e] 28011 1726882537.18906: sending task result for task 12673a56-9f93-962d-7c65-00000000026e 28011 1726882537.19579: done sending task result for task 12673a56-9f93-962d-7c65-00000000026e 28011 1726882537.19586: WORKER PROCESS EXITING 28011 1726882537.19722: no more pending results, returning what we have 28011 
1726882537.19725: results queue empty 28011 1726882537.19726: checking for any_errors_fatal 28011 1726882537.19729: done checking for any_errors_fatal 28011 1726882537.19730: checking for max_fail_percentage 28011 1726882537.19731: done checking for max_fail_percentage 28011 1726882537.19732: checking to see if all hosts have failed and the running result is not ok 28011 1726882537.19733: done checking to see if all hosts have failed 28011 1726882537.19733: getting the remaining hosts for this loop 28011 1726882537.19735: done getting the remaining hosts for this loop 28011 1726882537.19738: getting the next task for host managed_node1 28011 1726882537.19742: done getting next task for host managed_node1 28011 1726882537.19744: ^ task is: TASK: Set up veth as managed by NetworkManager 28011 1726882537.19746: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882537.19749: getting variables 28011 1726882537.19751: in VariableManager get_vars() 28011 1726882537.19780: Calling all_inventory to load vars for managed_node1 28011 1726882537.19783: Calling groups_inventory to load vars for managed_node1 28011 1726882537.19785: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882537.19798: Calling all_plugins_play to load vars for managed_node1 28011 1726882537.19801: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882537.19804: Calling groups_plugins_play to load vars for managed_node1 28011 1726882537.19958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882537.20164: done with get_vars() 28011 1726882537.20174: done getting variables 28011 1726882537.20241: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:35:37 -0400 (0:00:01.047) 0:00:06.753 ****** 28011 1726882537.20267: entering _queue_task() for managed_node1/command 28011 1726882537.20703: worker is 1 (out of 1 available) 28011 1726882537.20715: exiting _queue_task() for managed_node1/command 28011 1726882537.20728: done queuing things up, now waiting for results queue to drain 28011 1726882537.20730: waiting for pending results... 
28011 1726882537.20969: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 28011 1726882537.20985: in run() - task 12673a56-9f93-962d-7c65-00000000026f 28011 1726882537.21012: variable 'ansible_search_path' from source: unknown 28011 1726882537.21024: variable 'ansible_search_path' from source: unknown 28011 1726882537.21072: calling self._execute() 28011 1726882537.21177: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.21197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.21214: variable 'omit' from source: magic vars 28011 1726882537.21618: variable 'ansible_distribution_major_version' from source: facts 28011 1726882537.21638: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882537.21807: variable 'type' from source: set_fact 28011 1726882537.21825: variable 'state' from source: include params 28011 1726882537.21998: Evaluated conditional (type == 'veth' and state == 'present'): True 28011 1726882537.22001: variable 'omit' from source: magic vars 28011 1726882537.22003: variable 'omit' from source: magic vars 28011 1726882537.22005: variable 'interface' from source: set_fact 28011 1726882537.22007: variable 'omit' from source: magic vars 28011 1726882537.22054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882537.22098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882537.22133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882537.22155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882537.22171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
28011 1726882537.22211: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882537.22220: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.22235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.22343: Set connection var ansible_connection to ssh 28011 1726882537.22358: Set connection var ansible_pipelining to False 28011 1726882537.22370: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882537.22388: Set connection var ansible_shell_executable to /bin/sh 28011 1726882537.22408: Set connection var ansible_timeout to 10 28011 1726882537.22421: Set connection var ansible_shell_type to sh 28011 1726882537.22560: variable 'ansible_shell_executable' from source: unknown 28011 1726882537.22563: variable 'ansible_connection' from source: unknown 28011 1726882537.22566: variable 'ansible_module_compression' from source: unknown 28011 1726882537.22568: variable 'ansible_shell_type' from source: unknown 28011 1726882537.22570: variable 'ansible_shell_executable' from source: unknown 28011 1726882537.22572: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.22574: variable 'ansible_pipelining' from source: unknown 28011 1726882537.22576: variable 'ansible_timeout' from source: unknown 28011 1726882537.22578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.22681: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882537.22703: variable 'omit' from source: magic vars 28011 1726882537.22713: starting attempt loop 28011 1726882537.22720: running the handler 28011 1726882537.22742: _low_level_execute_command(): 
starting 28011 1726882537.22756: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882537.23501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882537.23514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882537.23551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.23562: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882537.23656: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882537.23680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882537.23757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882537.25691: stdout chunk (state=3): >>>/root <<< 28011 1726882537.25696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882537.25699: stdout chunk (state=3): >>><<< 28011 1726882537.25701: stderr chunk (state=3): >>><<< 28011 1726882537.25706: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882537.25709: _low_level_execute_command(): starting 28011 1726882537.25712: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375 `" && echo ansible-tmp-1726882537.2560728-28386-191215823790375="` echo /root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375 `" ) && sleep 0' 28011 1726882537.26219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882537.26234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882537.26249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882537.26276: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882537.26298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882537.26311: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882537.26407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882537.26422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882537.26437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882537.26511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882537.28364: stdout chunk (state=3): >>>ansible-tmp-1726882537.2560728-28386-191215823790375=/root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375 <<< 28011 1726882537.28497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882537.28514: stdout chunk (state=3): >>><<< 28011 1726882537.28524: stderr chunk (state=3): >>><<< 28011 1726882537.28543: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882537.2560728-28386-191215823790375=/root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882537.28698: variable 'ansible_module_compression' from source: unknown 28011 1726882537.28701: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28011 1726882537.28704: variable 'ansible_facts' from source: unknown 28011 1726882537.28757: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375/AnsiballZ_command.py 28011 1726882537.28952: Sending initial data 28011 1726882537.28955: Sent initial data (156 bytes) 28011 1726882537.29551: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882537.29565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882537.29601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882537.29707: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882537.29722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882537.29742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882537.29921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882537.31364: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882537.31481: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 28011 1726882537.31521: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpujk37x2a /root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375/AnsiballZ_command.py <<< 28011 1726882537.31524: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375/AnsiballZ_command.py" <<< 28011 1726882537.31598: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpujk37x2a" to remote "/root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375/AnsiballZ_command.py" <<< 28011 1726882537.32588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882537.32600: stdout chunk (state=3): >>><<< 28011 1726882537.32612: stderr chunk (state=3): >>><<< 28011 1726882537.32649: done transferring module to remote 28011 1726882537.32663: _low_level_execute_command(): starting 28011 1726882537.32672: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375/ /root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375/AnsiballZ_command.py && sleep 0' 28011 1726882537.33277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.33332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882537.33366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882537.33441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882537.35163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882537.35167: stdout chunk (state=3): >>><<< 28011 1726882537.35170: stderr chunk (state=3): >>><<< 28011 1726882537.35181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882537.35184: _low_level_execute_command(): starting 28011 1726882537.35192: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375/AnsiballZ_command.py && sleep 0' 28011 1726882537.35582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882537.35586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.35609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.35662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882537.35674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882537.35747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882537.52618: stdout 
chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 21:35:37.506897", "end": "2024-09-20 21:35:37.524783", "delta": "0:00:00.017886", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882537.54100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882537.54126: stderr chunk (state=3): >>><<< 28011 1726882537.54130: stdout chunk (state=3): >>><<< 28011 1726882537.54148: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 21:35:37.506897", "end": "2024-09-20 21:35:37.524783", "delta": "0:00:00.017886", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882537.54183: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882537.54191: _low_level_execute_command(): starting 28011 1726882537.54200: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882537.2560728-28386-191215823790375/ > /dev/null 2>&1 && sleep 0' 28011 1726882537.54646: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882537.54649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882537.54651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.54653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882537.54660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.54712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882537.54720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882537.54759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882537.56599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882537.56626: stderr chunk (state=3): >>><<< 28011 1726882537.56629: stdout chunk (state=3): >>><<< 28011 1726882537.56641: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882537.56651: handler run complete 28011 1726882537.56668: Evaluated conditional (False): False 28011 1726882537.56676: attempt loop complete, returning result 28011 1726882537.56678: _execute() done 28011 1726882537.56681: dumping result to json 28011 1726882537.56686: done dumping result, returning 28011 1726882537.56696: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [12673a56-9f93-962d-7c65-00000000026f] 28011 1726882537.56701: sending task result for task 12673a56-9f93-962d-7c65-00000000026f 28011 1726882537.56798: done sending task result for task 12673a56-9f93-962d-7c65-00000000026f 28011 1726882537.56801: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.017886", "end": "2024-09-20 21:35:37.524783", "rc": 0, "start": "2024-09-20 21:35:37.506897" } 28011 1726882537.56860: no more pending results, returning what we have 28011 1726882537.56863: results queue empty 28011 1726882537.56864: checking for any_errors_fatal 28011 1726882537.56877: done checking for any_errors_fatal 28011 1726882537.56878: checking for max_fail_percentage 28011 1726882537.56880: done checking for max_fail_percentage 28011 1726882537.56881: checking to see if all 
hosts have failed and the running result is not ok 28011 1726882537.56882: done checking to see if all hosts have failed 28011 1726882537.56882: getting the remaining hosts for this loop 28011 1726882537.56884: done getting the remaining hosts for this loop 28011 1726882537.56887: getting the next task for host managed_node1 28011 1726882537.56897: done getting next task for host managed_node1 28011 1726882537.56899: ^ task is: TASK: Delete veth interface {{ interface }} 28011 1726882537.56902: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882537.56906: getting variables 28011 1726882537.56908: in VariableManager get_vars() 28011 1726882537.56948: Calling all_inventory to load vars for managed_node1 28011 1726882537.56950: Calling groups_inventory to load vars for managed_node1 28011 1726882537.56952: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882537.56962: Calling all_plugins_play to load vars for managed_node1 28011 1726882537.56964: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882537.56966: Calling groups_plugins_play to load vars for managed_node1 28011 1726882537.57113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882537.57259: done with get_vars() 28011 1726882537.57266: done getting variables 28011 1726882537.57310: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882537.57398: variable 'interface' from source: set_fact

TASK [Delete veth interface ethtest0] ******************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43
Friday 20 September 2024 21:35:37 -0400 (0:00:00.371) 0:00:07.125 ******
28011 1726882537.57420: entering _queue_task() for managed_node1/command 28011 1726882537.57617: worker is 1 (out of 1 available) 28011 1726882537.57630: exiting _queue_task() for managed_node1/command 28011 1726882537.57643: done queuing things up, now waiting for results queue to drain 28011 1726882537.57645: waiting for pending results... 
28011 1726882537.57796: running TaskExecutor() for managed_node1/TASK: Delete veth interface ethtest0 28011 1726882537.57854: in run() - task 12673a56-9f93-962d-7c65-000000000270 28011 1726882537.57866: variable 'ansible_search_path' from source: unknown 28011 1726882537.57870: variable 'ansible_search_path' from source: unknown 28011 1726882537.57900: calling self._execute() 28011 1726882537.57962: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.57966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.57977: variable 'omit' from source: magic vars 28011 1726882537.58236: variable 'ansible_distribution_major_version' from source: facts 28011 1726882537.58245: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882537.58397: variable 'type' from source: set_fact 28011 1726882537.58401: variable 'state' from source: include params 28011 1726882537.58404: variable 'interface' from source: set_fact 28011 1726882537.58407: variable 'current_interfaces' from source: set_fact 28011 1726882537.58415: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 28011 1726882537.58419: when evaluation is False, skipping this task 28011 1726882537.58422: _execute() done 28011 1726882537.58424: dumping result to json 28011 1726882537.58429: done dumping result, returning 28011 1726882537.58431: done running TaskExecutor() for managed_node1/TASK: Delete veth interface ethtest0 [12673a56-9f93-962d-7c65-000000000270] 28011 1726882537.58441: sending task result for task 12673a56-9f93-962d-7c65-000000000270 28011 1726882537.58517: done sending task result for task 12673a56-9f93-962d-7c65-000000000270 28011 1726882537.58520: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 28011 1726882537.58579: no more pending results, returning what we have 28011 1726882537.58582: results queue empty 28011 1726882537.58583: checking for any_errors_fatal 28011 1726882537.58592: done checking for any_errors_fatal 28011 1726882537.58594: checking for max_fail_percentage 28011 1726882537.58596: done checking for max_fail_percentage 28011 1726882537.58597: checking to see if all hosts have failed and the running result is not ok 28011 1726882537.58597: done checking to see if all hosts have failed 28011 1726882537.58598: getting the remaining hosts for this loop 28011 1726882537.58599: done getting the remaining hosts for this loop 28011 1726882537.58602: getting the next task for host managed_node1 28011 1726882537.58606: done getting next task for host managed_node1 28011 1726882537.58608: ^ task is: TASK: Create dummy interface {{ interface }} 28011 1726882537.58611: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882537.58613: getting variables 28011 1726882537.58615: in VariableManager get_vars() 28011 1726882537.58645: Calling all_inventory to load vars for managed_node1 28011 1726882537.58648: Calling groups_inventory to load vars for managed_node1 28011 1726882537.58649: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882537.58656: Calling all_plugins_play to load vars for managed_node1 28011 1726882537.58657: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882537.58659: Calling groups_plugins_play to load vars for managed_node1 28011 1726882537.58766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882537.58885: done with get_vars() 28011 1726882537.58895: done getting variables 28011 1726882537.58934: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882537.59014: variable 'interface' from source: set_fact

TASK [Create dummy interface ethtest0] *****************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49
Friday 20 September 2024 21:35:37 -0400 (0:00:00.016) 0:00:07.141 ******
28011 1726882537.59034: entering _queue_task() for managed_node1/command 28011 1726882537.59206: worker is 1 (out of 1 available) 28011 1726882537.59219: exiting _queue_task() for managed_node1/command 28011 1726882537.59230: done queuing things up, now waiting for results queue to drain 28011 1726882537.59232: waiting for pending results... 
28011 1726882537.59362: running TaskExecutor() for managed_node1/TASK: Create dummy interface ethtest0 28011 1726882537.59425: in run() - task 12673a56-9f93-962d-7c65-000000000271 28011 1726882537.59437: variable 'ansible_search_path' from source: unknown 28011 1726882537.59440: variable 'ansible_search_path' from source: unknown 28011 1726882537.59473: calling self._execute() 28011 1726882537.59540: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.59543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.59552: variable 'omit' from source: magic vars 28011 1726882537.59998: variable 'ansible_distribution_major_version' from source: facts 28011 1726882537.60002: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882537.60074: variable 'type' from source: set_fact 28011 1726882537.60084: variable 'state' from source: include params 28011 1726882537.60099: variable 'interface' from source: set_fact 28011 1726882537.60108: variable 'current_interfaces' from source: set_fact 28011 1726882537.60120: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 28011 1726882537.60127: when evaluation is False, skipping this task 28011 1726882537.60133: _execute() done 28011 1726882537.60139: dumping result to json 28011 1726882537.60146: done dumping result, returning 28011 1726882537.60155: done running TaskExecutor() for managed_node1/TASK: Create dummy interface ethtest0 [12673a56-9f93-962d-7c65-000000000271] 28011 1726882537.60163: sending task result for task 12673a56-9f93-962d-7c65-000000000271 28011 1726882537.60257: done sending task result for task 12673a56-9f93-962d-7c65-000000000271 28011 1726882537.60264: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 28011 1726882537.60332: no more pending results, returning what we have 28011 1726882537.60335: results queue empty 28011 1726882537.60336: checking for any_errors_fatal 28011 1726882537.60341: done checking for any_errors_fatal 28011 1726882537.60342: checking for max_fail_percentage 28011 1726882537.60344: done checking for max_fail_percentage 28011 1726882537.60345: checking to see if all hosts have failed and the running result is not ok 28011 1726882537.60346: done checking to see if all hosts have failed 28011 1726882537.60346: getting the remaining hosts for this loop 28011 1726882537.60348: done getting the remaining hosts for this loop 28011 1726882537.60351: getting the next task for host managed_node1 28011 1726882537.60356: done getting next task for host managed_node1 28011 1726882537.60358: ^ task is: TASK: Delete dummy interface {{ interface }} 28011 1726882537.60362: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882537.60365: getting variables 28011 1726882537.60367: in VariableManager get_vars() 28011 1726882537.60468: Calling all_inventory to load vars for managed_node1 28011 1726882537.60471: Calling groups_inventory to load vars for managed_node1 28011 1726882537.60473: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882537.60481: Calling all_plugins_play to load vars for managed_node1 28011 1726882537.60484: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882537.60486: Calling groups_plugins_play to load vars for managed_node1 28011 1726882537.60714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882537.60895: done with get_vars() 28011 1726882537.60905: done getting variables 28011 1726882537.60949: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882537.61020: variable 'interface' from source: set_fact

TASK [Delete dummy interface ethtest0] *****************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54
Friday 20 September 2024 21:35:37 -0400 (0:00:00.020) 0:00:07.161 ******
28011 1726882537.61046: entering _queue_task() for managed_node1/command 28011 1726882537.61207: worker is 1 (out of 1 available) 28011 1726882537.61220: exiting _queue_task() for managed_node1/command 28011 1726882537.61230: done queuing things up, now waiting for results queue to drain 28011 1726882537.61232: waiting for pending results... 
28011 1726882537.61372: running TaskExecutor() for managed_node1/TASK: Delete dummy interface ethtest0 28011 1726882537.61434: in run() - task 12673a56-9f93-962d-7c65-000000000272 28011 1726882537.61443: variable 'ansible_search_path' from source: unknown 28011 1726882537.61446: variable 'ansible_search_path' from source: unknown 28011 1726882537.61474: calling self._execute() 28011 1726882537.61537: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.61540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.61548: variable 'omit' from source: magic vars 28011 1726882537.61790: variable 'ansible_distribution_major_version' from source: facts 28011 1726882537.61801: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882537.61923: variable 'type' from source: set_fact 28011 1726882537.61927: variable 'state' from source: include params 28011 1726882537.61930: variable 'interface' from source: set_fact 28011 1726882537.61933: variable 'current_interfaces' from source: set_fact 28011 1726882537.61940: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 28011 1726882537.61943: when evaluation is False, skipping this task 28011 1726882537.61946: _execute() done 28011 1726882537.61948: dumping result to json 28011 1726882537.61951: done dumping result, returning 28011 1726882537.61956: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface ethtest0 [12673a56-9f93-962d-7c65-000000000272] 28011 1726882537.61961: sending task result for task 12673a56-9f93-962d-7c65-000000000272 28011 1726882537.62036: done sending task result for task 12673a56-9f93-962d-7c65-000000000272 28011 1726882537.62039: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 28011 1726882537.62080: no more pending results, returning what we have 28011 1726882537.62083: results queue empty 28011 1726882537.62084: checking for any_errors_fatal 28011 1726882537.62089: done checking for any_errors_fatal 28011 1726882537.62090: checking for max_fail_percentage 28011 1726882537.62091: done checking for max_fail_percentage 28011 1726882537.62092: checking to see if all hosts have failed and the running result is not ok 28011 1726882537.62094: done checking to see if all hosts have failed 28011 1726882537.62095: getting the remaining hosts for this loop 28011 1726882537.62096: done getting the remaining hosts for this loop 28011 1726882537.62099: getting the next task for host managed_node1 28011 1726882537.62103: done getting next task for host managed_node1 28011 1726882537.62105: ^ task is: TASK: Create tap interface {{ interface }} 28011 1726882537.62108: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882537.62111: getting variables 28011 1726882537.62112: in VariableManager get_vars() 28011 1726882537.62141: Calling all_inventory to load vars for managed_node1 28011 1726882537.62144: Calling groups_inventory to load vars for managed_node1 28011 1726882537.62146: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882537.62153: Calling all_plugins_play to load vars for managed_node1 28011 1726882537.62156: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882537.62158: Calling groups_plugins_play to load vars for managed_node1 28011 1726882537.62261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882537.62381: done with get_vars() 28011 1726882537.62388: done getting variables 28011 1726882537.62427: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882537.62499: variable 'interface' from source: set_fact

TASK [Create tap interface ethtest0] *******************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60
Friday 20 September 2024 21:35:37 -0400 (0:00:00.014) 0:00:07.176 ******
28011 1726882537.62517: entering _queue_task() for managed_node1/command 28011 1726882537.62686: worker is 1 (out of 1 available) 28011 1726882537.62699: exiting _queue_task() for managed_node1/command 28011 1726882537.62711: done queuing things up, now waiting for results queue to drain 28011 1726882537.62712: waiting for pending results... 
28011 1726882537.63150: running TaskExecutor() for managed_node1/TASK: Create tap interface ethtest0 28011 1726882537.63187: in run() - task 12673a56-9f93-962d-7c65-000000000273 28011 1726882537.63249: variable 'ansible_search_path' from source: unknown 28011 1726882537.63252: variable 'ansible_search_path' from source: unknown 28011 1726882537.63255: calling self._execute() 28011 1726882537.63332: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.63344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.63363: variable 'omit' from source: magic vars 28011 1726882537.63844: variable 'ansible_distribution_major_version' from source: facts 28011 1726882537.63862: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882537.64097: variable 'type' from source: set_fact 28011 1726882537.64109: variable 'state' from source: include params 28011 1726882537.64133: variable 'interface' from source: set_fact 28011 1726882537.64136: variable 'current_interfaces' from source: set_fact 28011 1726882537.64142: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 28011 1726882537.64197: when evaluation is False, skipping this task 28011 1726882537.64200: _execute() done 28011 1726882537.64202: dumping result to json 28011 1726882537.64204: done dumping result, returning 28011 1726882537.64207: done running TaskExecutor() for managed_node1/TASK: Create tap interface ethtest0 [12673a56-9f93-962d-7c65-000000000273] 28011 1726882537.64209: sending task result for task 12673a56-9f93-962d-7c65-000000000273 28011 1726882537.64401: done sending task result for task 12673a56-9f93-962d-7c65-000000000273 28011 1726882537.64405: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 28011 1726882537.64443: no more pending results, returning what we have 28011 1726882537.64446: results queue empty 28011 1726882537.64446: checking for any_errors_fatal 28011 1726882537.64451: done checking for any_errors_fatal 28011 1726882537.64452: checking for max_fail_percentage 28011 1726882537.64453: done checking for max_fail_percentage 28011 1726882537.64454: checking to see if all hosts have failed and the running result is not ok 28011 1726882537.64455: done checking to see if all hosts have failed 28011 1726882537.64456: getting the remaining hosts for this loop 28011 1726882537.64457: done getting the remaining hosts for this loop 28011 1726882537.64460: getting the next task for host managed_node1 28011 1726882537.64465: done getting next task for host managed_node1 28011 1726882537.64468: ^ task is: TASK: Delete tap interface {{ interface }} 28011 1726882537.64470: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882537.64474: getting variables 28011 1726882537.64475: in VariableManager get_vars() 28011 1726882537.64569: Calling all_inventory to load vars for managed_node1 28011 1726882537.64577: Calling groups_inventory to load vars for managed_node1 28011 1726882537.64581: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882537.64597: Calling all_plugins_play to load vars for managed_node1 28011 1726882537.64600: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882537.64610: Calling groups_plugins_play to load vars for managed_node1 28011 1726882537.64862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882537.64979: done with get_vars() 28011 1726882537.64986: done getting variables 28011 1726882537.65025: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882537.65095: variable 'interface' from source: set_fact

TASK [Delete tap interface ethtest0] *******************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65
Friday 20 September 2024 21:35:37 -0400 (0:00:00.025) 0:00:07.202 ******
28011 1726882537.65114: entering _queue_task() for managed_node1/command 28011 1726882537.65275: worker is 1 (out of 1 available) 28011 1726882537.65289: exiting _queue_task() for managed_node1/command 28011 1726882537.65303: done queuing things up, now waiting for results queue to drain 28011 1726882537.65304: waiting for pending results... 
28011 1726882537.65450: running TaskExecutor() for managed_node1/TASK: Delete tap interface ethtest0 28011 1726882537.65513: in run() - task 12673a56-9f93-962d-7c65-000000000274 28011 1726882537.65524: variable 'ansible_search_path' from source: unknown 28011 1726882537.65527: variable 'ansible_search_path' from source: unknown 28011 1726882537.65555: calling self._execute() 28011 1726882537.65614: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.65619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.65627: variable 'omit' from source: magic vars 28011 1726882537.65870: variable 'ansible_distribution_major_version' from source: facts 28011 1726882537.65879: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882537.66008: variable 'type' from source: set_fact 28011 1726882537.66011: variable 'state' from source: include params 28011 1726882537.66014: variable 'interface' from source: set_fact 28011 1726882537.66019: variable 'current_interfaces' from source: set_fact 28011 1726882537.66029: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 28011 1726882537.66032: when evaluation is False, skipping this task 28011 1726882537.66034: _execute() done 28011 1726882537.66036: dumping result to json 28011 1726882537.66039: done dumping result, returning 28011 1726882537.66044: done running TaskExecutor() for managed_node1/TASK: Delete tap interface ethtest0 [12673a56-9f93-962d-7c65-000000000274] 28011 1726882537.66049: sending task result for task 12673a56-9f93-962d-7c65-000000000274 28011 1726882537.66122: done sending task result for task 12673a56-9f93-962d-7c65-000000000274 28011 1726882537.66125: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
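Each of the skipped tasks above is guarded by a `when:` expression that the executor echoes back as `false_condition`. A minimal sketch of how such guarded create/delete tasks in `manage_test_interface.yml` are typically shaped follows; the `when:` expressions are copied verbatim from the log, but the task bodies (`ip link` commands) are illustrative assumptions, not the actual file contents:

```yaml
# Hypothetical sketch of the guarded tasks in manage_test_interface.yml.
# Only the when: expressions are taken from the log; the command lines
# are assumed for illustration.
- name: Delete veth interface {{ interface }}
  command: ip link del {{ interface }} type veth
  when: type == 'veth' and state == 'absent' and interface in current_interfaces

- name: Create dummy interface {{ interface }}
  command: ip link add {{ interface }} type dummy
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces

- name: Delete tap interface {{ interface }}
  command: ip link del {{ interface }} type tap
  when: type == 'tap' and state == 'absent' and interface in current_interfaces
```

When a `when:` expression evaluates to False, the task executor skips the task without contacting the remote host and reports the expression back as `false_condition`, which is exactly the `skipping: [managed_node1]` pattern repeated in the results above.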
28011 1726882537.66170: no more pending results, returning what we have 28011 1726882537.66173: results queue empty 28011 1726882537.66174: checking for any_errors_fatal 28011 1726882537.66178: done checking for any_errors_fatal 28011 1726882537.66179: checking for max_fail_percentage 28011 1726882537.66181: done checking for max_fail_percentage 28011 1726882537.66181: checking to see if all hosts have failed and the running result is not ok 28011 1726882537.66182: done checking to see if all hosts have failed 28011 1726882537.66183: getting the remaining hosts for this loop 28011 1726882537.66184: done getting the remaining hosts for this loop 28011 1726882537.66187: getting the next task for host managed_node1 28011 1726882537.66192: done getting next task for host managed_node1 28011 1726882537.66197: ^ task is: TASK: Include the task 'assert_device_present.yml' 28011 1726882537.66199: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882537.66202: getting variables 28011 1726882537.66203: in VariableManager get_vars() 28011 1726882537.66235: Calling all_inventory to load vars for managed_node1 28011 1726882537.66237: Calling groups_inventory to load vars for managed_node1 28011 1726882537.66239: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882537.66247: Calling all_plugins_play to load vars for managed_node1 28011 1726882537.66249: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882537.66252: Calling groups_plugins_play to load vars for managed_node1 28011 1726882537.66354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882537.66515: done with get_vars() 28011 1726882537.66524: done getting variables

TASK [Include the task 'assert_device_present.yml'] ****************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:21
Friday 20 September 2024 21:35:37 -0400 (0:00:00.014) 0:00:07.217 ******
28011 1726882537.66604: entering _queue_task() for managed_node1/include_tasks 28011 1726882537.66806: worker is 1 (out of 1 available) 28011 1726882537.66819: exiting _queue_task() for managed_node1/include_tasks 28011 1726882537.66830: done queuing things up, now waiting for results queue to drain 28011 1726882537.66832: waiting for pending results... 
28011 1726882537.67211: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 28011 1726882537.67215: in run() - task 12673a56-9f93-962d-7c65-00000000000e 28011 1726882537.67219: variable 'ansible_search_path' from source: unknown 28011 1726882537.67222: calling self._execute() 28011 1726882537.67269: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.67280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.67296: variable 'omit' from source: magic vars 28011 1726882537.67601: variable 'ansible_distribution_major_version' from source: facts 28011 1726882537.67611: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882537.67618: _execute() done 28011 1726882537.67621: dumping result to json 28011 1726882537.67623: done dumping result, returning 28011 1726882537.67637: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [12673a56-9f93-962d-7c65-00000000000e] 28011 1726882537.67643: sending task result for task 12673a56-9f93-962d-7c65-00000000000e 28011 1726882537.67715: done sending task result for task 12673a56-9f93-962d-7c65-00000000000e 28011 1726882537.67718: WORKER PROCESS EXITING 28011 1726882537.67768: no more pending results, returning what we have 28011 1726882537.67772: in VariableManager get_vars() 28011 1726882537.67810: Calling all_inventory to load vars for managed_node1 28011 1726882537.67813: Calling groups_inventory to load vars for managed_node1 28011 1726882537.67815: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882537.67822: Calling all_plugins_play to load vars for managed_node1 28011 1726882537.67824: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882537.67826: Calling groups_plugins_play to load vars for managed_node1 28011 1726882537.67969: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882537.68079: done with get_vars() 28011 1726882537.68084: variable 'ansible_search_path' from source: unknown 28011 1726882537.68096: we have included files to process 28011 1726882537.68097: generating all_blocks data 28011 1726882537.68098: done generating all_blocks data 28011 1726882537.68100: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28011 1726882537.68101: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28011 1726882537.68102: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28011 1726882537.68202: in VariableManager get_vars() 28011 1726882537.68215: done with get_vars() 28011 1726882537.68279: done processing included file 28011 1726882537.68281: iterating over new_blocks loaded from include file 28011 1726882537.68284: in VariableManager get_vars() 28011 1726882537.68300: done with get_vars() 28011 1726882537.68301: filtering new block on tags 28011 1726882537.68312: done filtering new block on tags 28011 1726882537.68314: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 28011 1726882537.68316: extending task lists for all hosts with included blocks 28011 1726882537.69877: done extending task lists 28011 1726882537.69879: done processing included files 28011 1726882537.69879: results queue empty 28011 1726882537.69880: checking for any_errors_fatal 28011 1726882537.69888: done checking for any_errors_fatal 28011 1726882537.69913: checking for max_fail_percentage 28011 1726882537.69915: done 
checking for max_fail_percentage 28011 1726882537.69916: checking to see if all hosts have failed and the running result is not ok 28011 1726882537.69916: done checking to see if all hosts have failed 28011 1726882537.69917: getting the remaining hosts for this loop 28011 1726882537.69918: done getting the remaining hosts for this loop 28011 1726882537.69921: getting the next task for host managed_node1 28011 1726882537.69931: done getting next task for host managed_node1 28011 1726882537.69933: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28011 1726882537.69936: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882537.69938: getting variables 28011 1726882537.69939: in VariableManager get_vars() 28011 1726882537.69953: Calling all_inventory to load vars for managed_node1 28011 1726882537.69955: Calling groups_inventory to load vars for managed_node1 28011 1726882537.69957: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882537.69962: Calling all_plugins_play to load vars for managed_node1 28011 1726882537.69965: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882537.69968: Calling groups_plugins_play to load vars for managed_node1 28011 1726882537.70117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882537.70339: done with get_vars() 28011 1726882537.70348: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:35:37 -0400 (0:00:00.038) 0:00:07.255 ****** 28011 1726882537.70438: entering _queue_task() for managed_node1/include_tasks 28011 1726882537.70690: worker is 1 (out of 1 available) 28011 1726882537.70703: exiting _queue_task() for managed_node1/include_tasks 28011 1726882537.70719: done queuing things up, now waiting for results queue to drain 28011 1726882537.70721: waiting for pending results... 
28011 1726882537.70879: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 28011 1726882537.70946: in run() - task 12673a56-9f93-962d-7c65-0000000003e0 28011 1726882537.70954: variable 'ansible_search_path' from source: unknown 28011 1726882537.70958: variable 'ansible_search_path' from source: unknown 28011 1726882537.70985: calling self._execute() 28011 1726882537.71050: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.71053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.71062: variable 'omit' from source: magic vars 28011 1726882537.71325: variable 'ansible_distribution_major_version' from source: facts 28011 1726882537.71334: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882537.71339: _execute() done 28011 1726882537.71343: dumping result to json 28011 1726882537.71345: done dumping result, returning 28011 1726882537.71352: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-962d-7c65-0000000003e0] 28011 1726882537.71362: sending task result for task 12673a56-9f93-962d-7c65-0000000003e0 28011 1726882537.71440: done sending task result for task 12673a56-9f93-962d-7c65-0000000003e0 28011 1726882537.71443: WORKER PROCESS EXITING 28011 1726882537.71468: no more pending results, returning what we have 28011 1726882537.71472: in VariableManager get_vars() 28011 1726882537.71523: Calling all_inventory to load vars for managed_node1 28011 1726882537.71526: Calling groups_inventory to load vars for managed_node1 28011 1726882537.71529: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882537.71543: Calling all_plugins_play to load vars for managed_node1 28011 1726882537.71546: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882537.71550: Calling groups_plugins_play to load vars for managed_node1 28011 
1726882537.71716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882537.71899: done with get_vars() 28011 1726882537.71906: variable 'ansible_search_path' from source: unknown 28011 1726882537.71907: variable 'ansible_search_path' from source: unknown 28011 1726882537.71940: we have included files to process 28011 1726882537.71942: generating all_blocks data 28011 1726882537.71943: done generating all_blocks data 28011 1726882537.71945: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28011 1726882537.71946: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28011 1726882537.71948: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28011 1726882537.72161: done processing included file 28011 1726882537.72163: iterating over new_blocks loaded from include file 28011 1726882537.72165: in VariableManager get_vars() 28011 1726882537.72181: done with get_vars() 28011 1726882537.72183: filtering new block on tags 28011 1726882537.72200: done filtering new block on tags 28011 1726882537.72202: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 28011 1726882537.72206: extending task lists for all hosts with included blocks 28011 1726882537.72304: done extending task lists 28011 1726882537.72305: done processing included files 28011 1726882537.72306: results queue empty 28011 1726882537.72307: checking for any_errors_fatal 28011 1726882537.72310: done checking for any_errors_fatal 28011 1726882537.72311: checking for max_fail_percentage 28011 1726882537.72311: done checking for 
max_fail_percentage 28011 1726882537.72312: checking to see if all hosts have failed and the running result is not ok 28011 1726882537.72313: done checking to see if all hosts have failed 28011 1726882537.72321: getting the remaining hosts for this loop 28011 1726882537.72322: done getting the remaining hosts for this loop 28011 1726882537.72327: getting the next task for host managed_node1 28011 1726882537.72331: done getting next task for host managed_node1 28011 1726882537.72333: ^ task is: TASK: Get stat for interface {{ interface }} 28011 1726882537.72335: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882537.72338: getting variables 28011 1726882537.72339: in VariableManager get_vars() 28011 1726882537.72351: Calling all_inventory to load vars for managed_node1 28011 1726882537.72353: Calling groups_inventory to load vars for managed_node1 28011 1726882537.72355: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882537.72359: Calling all_plugins_play to load vars for managed_node1 28011 1726882537.72361: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882537.72364: Calling groups_plugins_play to load vars for managed_node1 28011 1726882537.72558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882537.72749: done with get_vars() 28011 1726882537.72758: done getting variables 28011 1726882537.72897: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:35:37 -0400 (0:00:00.024) 0:00:07.280 ****** 28011 1726882537.72923: entering _queue_task() for managed_node1/stat 28011 1726882537.73157: worker is 1 (out of 1 available) 28011 1726882537.73169: exiting _queue_task() for managed_node1/stat 28011 1726882537.73179: done queuing things up, now waiting for results queue to drain 28011 1726882537.73181: waiting for pending results... 
28011 1726882537.73439: running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 28011 1726882537.73522: in run() - task 12673a56-9f93-962d-7c65-0000000004ff 28011 1726882537.73547: variable 'ansible_search_path' from source: unknown 28011 1726882537.73550: variable 'ansible_search_path' from source: unknown 28011 1726882537.73568: calling self._execute() 28011 1726882537.73709: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.73712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.73715: variable 'omit' from source: magic vars 28011 1726882537.74051: variable 'ansible_distribution_major_version' from source: facts 28011 1726882537.74068: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882537.74077: variable 'omit' from source: magic vars 28011 1726882537.74126: variable 'omit' from source: magic vars 28011 1726882537.74221: variable 'interface' from source: set_fact 28011 1726882537.74242: variable 'omit' from source: magic vars 28011 1726882537.74283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882537.74386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882537.74392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882537.74397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882537.74399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882537.74441: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882537.74451: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.74458: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.74592: Set connection var ansible_connection to ssh 28011 1726882537.74602: Set connection var ansible_pipelining to False 28011 1726882537.74608: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882537.74617: Set connection var ansible_shell_executable to /bin/sh 28011 1726882537.74647: Set connection var ansible_timeout to 10 28011 1726882537.74654: Set connection var ansible_shell_type to sh 28011 1726882537.74671: variable 'ansible_shell_executable' from source: unknown 28011 1726882537.74675: variable 'ansible_connection' from source: unknown 28011 1726882537.74677: variable 'ansible_module_compression' from source: unknown 28011 1726882537.74680: variable 'ansible_shell_type' from source: unknown 28011 1726882537.74683: variable 'ansible_shell_executable' from source: unknown 28011 1726882537.74686: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882537.74688: variable 'ansible_pipelining' from source: unknown 28011 1726882537.74695: variable 'ansible_timeout' from source: unknown 28011 1726882537.74698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882537.74998: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882537.75002: variable 'omit' from source: magic vars 28011 1726882537.75005: starting attempt loop 28011 1726882537.75007: running the handler 28011 1726882537.75010: _low_level_execute_command(): starting 28011 1726882537.75012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882537.75569: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882537.75584: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 28011 1726882537.75608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882537.75632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882537.75684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.75751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882537.75767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882537.75817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882537.76000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882537.77561: stdout chunk (state=3): >>>/root <<< 28011 1726882537.77703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882537.77707: stdout chunk (state=3): >>><<< 28011 1726882537.77725: stderr chunk (state=3): >>><<< 28011 1726882537.77828: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882537.77833: _low_level_execute_command(): starting 28011 1726882537.77835: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991 `" && echo ansible-tmp-1726882537.777432-28421-73084086707991="` echo /root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991 `" ) && sleep 0' 28011 1726882537.78353: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882537.78369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882537.78411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass <<< 28011 1726882537.78460: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.78533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882537.78579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882537.78624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882537.80494: stdout chunk (state=3): >>>ansible-tmp-1726882537.777432-28421-73084086707991=/root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991 <<< 28011 1726882537.80612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882537.80658: stderr chunk (state=3): >>><<< 28011 1726882537.80661: stdout chunk (state=3): >>><<< 28011 1726882537.80898: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882537.777432-28421-73084086707991=/root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882537.80902: variable 'ansible_module_compression' from source: unknown 28011 1726882537.80904: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28011 1726882537.80906: variable 'ansible_facts' from source: unknown 28011 1726882537.80919: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991/AnsiballZ_stat.py 28011 1726882537.81038: Sending initial data 28011 1726882537.81141: Sent initial data (151 bytes) 28011 1726882537.81653: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882537.81668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882537.81710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882537.81724: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 28011 1726882537.81737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882537.81752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882537.81808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.81846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882537.81870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882537.81887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882537.81968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882537.83507: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882537.83571: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882537.83627: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp4f6v3trw /root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991/AnsiballZ_stat.py <<< 28011 1726882537.83668: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991/AnsiballZ_stat.py" <<< 28011 1726882537.83708: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp4f6v3trw" to remote "/root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991/AnsiballZ_stat.py" <<< 28011 1726882537.84267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882537.84321: stderr chunk (state=3): >>><<< 28011 1726882537.84324: stdout chunk (state=3): >>><<< 28011 1726882537.84331: done transferring module to remote 28011 1726882537.84340: _low_level_execute_command(): starting 28011 1726882537.84344: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991/ /root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991/AnsiballZ_stat.py && sleep 0' 28011 1726882537.84760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882537.84763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.84765: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882537.84767: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882537.84772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.84819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882537.84826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882537.84866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882537.86575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882537.86604: stderr chunk (state=3): >>><<< 28011 1726882537.86606: stdout chunk (state=3): >>><<< 28011 1726882537.86616: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882537.86630: _low_level_execute_command(): starting 28011 1726882537.86632: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991/AnsiballZ_stat.py && sleep 0' 28011 1726882537.87021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882537.87024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.87027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882537.87029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882537.87084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882537.87087: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882537.87129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882538.02133: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30373, "dev": 23, "nlink": 1, "atime": 1726882536.4559839, "mtime": 1726882536.4559839, "ctime": 1726882536.4559839, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28011 1726882538.03331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882538.03352: stderr chunk (state=3): >>><<< 28011 1726882538.03355: stdout chunk (state=3): >>><<< 28011 1726882538.03374: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30373, "dev": 23, "nlink": 1, "atime": 1726882536.4559839, "mtime": 1726882536.4559839, "ctime": 1726882536.4559839, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882538.03416: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882538.03424: _low_level_execute_command(): starting 28011 1726882538.03430: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882537.777432-28421-73084086707991/ > /dev/null 2>&1 && sleep 0' 28011 1726882538.03856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882538.03859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882538.03861: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882538.03863: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882538.03865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882538.03925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882538.03928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882538.03968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882538.05768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882538.05811: stderr chunk (state=3): >>><<< 28011 1726882538.05814: stdout chunk (state=3): >>><<< 28011 1726882538.05826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882538.05832: handler run complete 28011 1726882538.05860: attempt loop complete, returning result 28011 1726882538.05863: _execute() done 28011 1726882538.05865: dumping result to json 28011 1726882538.05871: done dumping result, returning 28011 1726882538.05879: done running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 [12673a56-9f93-962d-7c65-0000000004ff] 28011 1726882538.05883: sending task result for task 12673a56-9f93-962d-7c65-0000000004ff 28011 1726882538.06068: done sending task result for task 12673a56-9f93-962d-7c65-0000000004ff 28011 1726882538.06071: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882536.4559839, "block_size": 4096, "blocks": 0, "ctime": 1726882536.4559839, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30373, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1726882536.4559839, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 28011 1726882538.06244: no more pending results, returning what we have 28011 
1726882538.06246: results queue empty 28011 1726882538.06247: checking for any_errors_fatal 28011 1726882538.06248: done checking for any_errors_fatal 28011 1726882538.06249: checking for max_fail_percentage 28011 1726882538.06250: done checking for max_fail_percentage 28011 1726882538.06250: checking to see if all hosts have failed and the running result is not ok 28011 1726882538.06251: done checking to see if all hosts have failed 28011 1726882538.06251: getting the remaining hosts for this loop 28011 1726882538.06252: done getting the remaining hosts for this loop 28011 1726882538.06254: getting the next task for host managed_node1 28011 1726882538.06259: done getting next task for host managed_node1 28011 1726882538.06261: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 28011 1726882538.06263: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882538.06266: getting variables 28011 1726882538.06267: in VariableManager get_vars() 28011 1726882538.06297: Calling all_inventory to load vars for managed_node1 28011 1726882538.06300: Calling groups_inventory to load vars for managed_node1 28011 1726882538.06301: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882538.06310: Calling all_plugins_play to load vars for managed_node1 28011 1726882538.06311: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882538.06313: Calling groups_plugins_play to load vars for managed_node1 28011 1726882538.06424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882538.06563: done with get_vars() 28011 1726882538.06569: done getting variables 28011 1726882538.06644: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 28011 1726882538.06729: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:35:38 -0400 (0:00:00.338) 0:00:07.618 ****** 28011 1726882538.06752: entering _queue_task() for managed_node1/assert 28011 1726882538.06757: Creating lock for assert 28011 1726882538.06960: worker is 1 (out of 1 available) 28011 1726882538.06974: exiting _queue_task() for managed_node1/assert 28011 1726882538.06986: done queuing things up, now waiting for results queue to drain 28011 1726882538.06988: waiting for pending results... 
28011 1726882538.07145: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'ethtest0' 28011 1726882538.07203: in run() - task 12673a56-9f93-962d-7c65-0000000003e1 28011 1726882538.07222: variable 'ansible_search_path' from source: unknown 28011 1726882538.07225: variable 'ansible_search_path' from source: unknown 28011 1726882538.07248: calling self._execute() 28011 1726882538.07350: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882538.07353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882538.07356: variable 'omit' from source: magic vars 28011 1726882538.07669: variable 'ansible_distribution_major_version' from source: facts 28011 1726882538.07898: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882538.07901: variable 'omit' from source: magic vars 28011 1726882538.07904: variable 'omit' from source: magic vars 28011 1726882538.07905: variable 'interface' from source: set_fact 28011 1726882538.07907: variable 'omit' from source: magic vars 28011 1726882538.07909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882538.07913: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882538.07944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882538.07966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882538.07984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882538.08021: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882538.08031: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882538.08038: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882538.08143: Set connection var ansible_connection to ssh 28011 1726882538.08161: Set connection var ansible_pipelining to False 28011 1726882538.08167: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882538.08172: Set connection var ansible_shell_executable to /bin/sh 28011 1726882538.08179: Set connection var ansible_timeout to 10 28011 1726882538.08183: Set connection var ansible_shell_type to sh 28011 1726882538.08209: variable 'ansible_shell_executable' from source: unknown 28011 1726882538.08212: variable 'ansible_connection' from source: unknown 28011 1726882538.08215: variable 'ansible_module_compression' from source: unknown 28011 1726882538.08217: variable 'ansible_shell_type' from source: unknown 28011 1726882538.08219: variable 'ansible_shell_executable' from source: unknown 28011 1726882538.08221: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882538.08237: variable 'ansible_pipelining' from source: unknown 28011 1726882538.08240: variable 'ansible_timeout' from source: unknown 28011 1726882538.08242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882538.08361: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882538.08381: variable 'omit' from source: magic vars 28011 1726882538.08389: starting attempt loop 28011 1726882538.08448: running the handler 28011 1726882538.08537: variable 'interface_stat' from source: set_fact 28011 1726882538.08562: Evaluated conditional (interface_stat.stat.exists): True 28011 1726882538.08572: handler run complete 28011 1726882538.08590: attempt loop complete, returning result 28011 
1726882538.08635: _execute() done 28011 1726882538.08643: dumping result to json 28011 1726882538.08651: done dumping result, returning 28011 1726882538.08663: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'ethtest0' [12673a56-9f93-962d-7c65-0000000003e1] 28011 1726882538.08673: sending task result for task 12673a56-9f93-962d-7c65-0000000003e1 28011 1726882538.08887: done sending task result for task 12673a56-9f93-962d-7c65-0000000003e1 28011 1726882538.08892: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 28011 1726882538.08947: no more pending results, returning what we have 28011 1726882538.08950: results queue empty 28011 1726882538.08951: checking for any_errors_fatal 28011 1726882538.08961: done checking for any_errors_fatal 28011 1726882538.08962: checking for max_fail_percentage 28011 1726882538.08964: done checking for max_fail_percentage 28011 1726882538.08965: checking to see if all hosts have failed and the running result is not ok 28011 1726882538.08966: done checking to see if all hosts have failed 28011 1726882538.08966: getting the remaining hosts for this loop 28011 1726882538.08968: done getting the remaining hosts for this loop 28011 1726882538.08972: getting the next task for host managed_node1 28011 1726882538.08979: done getting next task for host managed_node1 28011 1726882538.08984: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28011 1726882538.08987: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882538.09005: getting variables 28011 1726882538.09007: in VariableManager get_vars() 28011 1726882538.09160: Calling all_inventory to load vars for managed_node1 28011 1726882538.09163: Calling groups_inventory to load vars for managed_node1 28011 1726882538.09166: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882538.09174: Calling all_plugins_play to load vars for managed_node1 28011 1726882538.09177: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882538.09180: Calling groups_plugins_play to load vars for managed_node1 28011 1726882538.09350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882538.09477: done with get_vars() 28011 1726882538.09485: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:35:38 -0400 (0:00:00.027) 0:00:07.646 ****** 28011 1726882538.09550: entering _queue_task() for managed_node1/include_tasks 28011 1726882538.09730: worker is 1 (out of 1 available) 28011 1726882538.09743: exiting _queue_task() for managed_node1/include_tasks 28011 1726882538.09754: done queuing things up, now waiting for results queue to drain 28011 1726882538.09755: waiting for pending results... 
28011 1726882538.09916: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28011 1726882538.09989: in run() - task 12673a56-9f93-962d-7c65-000000000016 28011 1726882538.09999: variable 'ansible_search_path' from source: unknown 28011 1726882538.10003: variable 'ansible_search_path' from source: unknown 28011 1726882538.10036: calling self._execute() 28011 1726882538.10094: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882538.10107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882538.10115: variable 'omit' from source: magic vars 28011 1726882538.10367: variable 'ansible_distribution_major_version' from source: facts 28011 1726882538.10377: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882538.10383: _execute() done 28011 1726882538.10386: dumping result to json 28011 1726882538.10389: done dumping result, returning 28011 1726882538.10397: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-962d-7c65-000000000016] 28011 1726882538.10403: sending task result for task 12673a56-9f93-962d-7c65-000000000016 28011 1726882538.10481: done sending task result for task 12673a56-9f93-962d-7c65-000000000016 28011 1726882538.10484: WORKER PROCESS EXITING 28011 1726882538.10557: no more pending results, returning what we have 28011 1726882538.10561: in VariableManager get_vars() 28011 1726882538.10599: Calling all_inventory to load vars for managed_node1 28011 1726882538.10601: Calling groups_inventory to load vars for managed_node1 28011 1726882538.10603: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882538.10609: Calling all_plugins_play to load vars for managed_node1 28011 1726882538.10611: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882538.10612: Calling 
groups_plugins_play to load vars for managed_node1 28011 1726882538.10752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882538.10867: done with get_vars() 28011 1726882538.10873: variable 'ansible_search_path' from source: unknown 28011 1726882538.10874: variable 'ansible_search_path' from source: unknown 28011 1726882538.10903: we have included files to process 28011 1726882538.10904: generating all_blocks data 28011 1726882538.10905: done generating all_blocks data 28011 1726882538.10908: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28011 1726882538.10909: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28011 1726882538.10910: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28011 1726882538.11548: done processing included file 28011 1726882538.11550: iterating over new_blocks loaded from include file 28011 1726882538.11551: in VariableManager get_vars() 28011 1726882538.11573: done with get_vars() 28011 1726882538.11575: filtering new block on tags 28011 1726882538.11589: done filtering new block on tags 28011 1726882538.11596: in VariableManager get_vars() 28011 1726882538.11617: done with get_vars() 28011 1726882538.11618: filtering new block on tags 28011 1726882538.11636: done filtering new block on tags 28011 1726882538.11639: in VariableManager get_vars() 28011 1726882538.11660: done with get_vars() 28011 1726882538.11662: filtering new block on tags 28011 1726882538.11678: done filtering new block on tags 28011 1726882538.11679: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 28011 1726882538.11684: extending task lists for 
all hosts with included blocks 28011 1726882538.12284: done extending task lists 28011 1726882538.12286: done processing included files 28011 1726882538.12286: results queue empty 28011 1726882538.12287: checking for any_errors_fatal 28011 1726882538.12288: done checking for any_errors_fatal 28011 1726882538.12289: checking for max_fail_percentage 28011 1726882538.12291: done checking for max_fail_percentage 28011 1726882538.12292: checking to see if all hosts have failed and the running result is not ok 28011 1726882538.12292: done checking to see if all hosts have failed 28011 1726882538.12294: getting the remaining hosts for this loop 28011 1726882538.12295: done getting the remaining hosts for this loop 28011 1726882538.12297: getting the next task for host managed_node1 28011 1726882538.12299: done getting next task for host managed_node1 28011 1726882538.12301: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28011 1726882538.12303: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882538.12309: getting variables 28011 1726882538.12310: in VariableManager get_vars() 28011 1726882538.12319: Calling all_inventory to load vars for managed_node1 28011 1726882538.12321: Calling groups_inventory to load vars for managed_node1 28011 1726882538.12322: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882538.12325: Calling all_plugins_play to load vars for managed_node1 28011 1726882538.12326: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882538.12328: Calling groups_plugins_play to load vars for managed_node1 28011 1726882538.12426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882538.12539: done with get_vars() 28011 1726882538.12545: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:35:38 -0400 (0:00:00.030) 0:00:07.677 ****** 28011 1726882538.12589: entering _queue_task() for managed_node1/setup 28011 1726882538.12778: worker is 1 (out of 1 available) 28011 1726882538.12795: exiting _queue_task() for managed_node1/setup 28011 1726882538.12807: done queuing things up, now waiting for results queue to drain 28011 1726882538.12809: waiting for pending results... 
28011 1726882538.12983: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28011 1726882538.13066: in run() - task 12673a56-9f93-962d-7c65-000000000517 28011 1726882538.13077: variable 'ansible_search_path' from source: unknown 28011 1726882538.13080: variable 'ansible_search_path' from source: unknown 28011 1726882538.13112: calling self._execute() 28011 1726882538.13171: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882538.13175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882538.13184: variable 'omit' from source: magic vars 28011 1726882538.13442: variable 'ansible_distribution_major_version' from source: facts 28011 1726882538.13451: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882538.13589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882538.15206: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882538.15250: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882538.15276: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882538.15305: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882538.15329: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882538.15380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882538.15404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882538.15421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882538.15451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882538.15461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882538.15500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882538.15517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882538.15533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882538.15561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882538.15571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882538.15670: variable '__network_required_facts' from source: role 
'' defaults 28011 1726882538.15676: variable 'ansible_facts' from source: unknown 28011 1726882538.15729: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28011 1726882538.15733: when evaluation is False, skipping this task 28011 1726882538.15736: _execute() done 28011 1726882538.15738: dumping result to json 28011 1726882538.15740: done dumping result, returning 28011 1726882538.15745: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-962d-7c65-000000000517] 28011 1726882538.15748: sending task result for task 12673a56-9f93-962d-7c65-000000000517 28011 1726882538.15825: done sending task result for task 12673a56-9f93-962d-7c65-000000000517 28011 1726882538.15827: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882538.15901: no more pending results, returning what we have 28011 1726882538.15904: results queue empty 28011 1726882538.15905: checking for any_errors_fatal 28011 1726882538.15906: done checking for any_errors_fatal 28011 1726882538.15907: checking for max_fail_percentage 28011 1726882538.15908: done checking for max_fail_percentage 28011 1726882538.15909: checking to see if all hosts have failed and the running result is not ok 28011 1726882538.15910: done checking to see if all hosts have failed 28011 1726882538.15910: getting the remaining hosts for this loop 28011 1726882538.15912: done getting the remaining hosts for this loop 28011 1726882538.15915: getting the next task for host managed_node1 28011 1726882538.15922: done getting next task for host managed_node1 28011 1726882538.15925: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28011 1726882538.15928: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882538.15940: getting variables 28011 1726882538.15941: in VariableManager get_vars() 28011 1726882538.15974: Calling all_inventory to load vars for managed_node1 28011 1726882538.15976: Calling groups_inventory to load vars for managed_node1 28011 1726882538.15978: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882538.15986: Calling all_plugins_play to load vars for managed_node1 28011 1726882538.15988: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882538.15990: Calling groups_plugins_play to load vars for managed_node1 28011 1726882538.16102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882538.16241: done with get_vars() 28011 1726882538.16248: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:35:38 -0400 (0:00:00.037) 0:00:07.714 ****** 28011 1726882538.16319: entering _queue_task() for managed_node1/stat 28011 1726882538.16499: worker is 1 (out of 1 
available) 28011 1726882538.16512: exiting _queue_task() for managed_node1/stat 28011 1726882538.16524: done queuing things up, now waiting for results queue to drain 28011 1726882538.16525: waiting for pending results... 28011 1726882538.16679: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 28011 1726882538.16763: in run() - task 12673a56-9f93-962d-7c65-000000000519 28011 1726882538.16774: variable 'ansible_search_path' from source: unknown 28011 1726882538.16778: variable 'ansible_search_path' from source: unknown 28011 1726882538.16809: calling self._execute() 28011 1726882538.16869: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882538.16872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882538.16881: variable 'omit' from source: magic vars 28011 1726882538.17140: variable 'ansible_distribution_major_version' from source: facts 28011 1726882538.17150: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882538.17260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882538.17447: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882538.17479: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882538.17505: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882538.17532: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882538.17596: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882538.17612: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882538.17632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882538.17649: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882538.17712: variable '__network_is_ostree' from source: set_fact 28011 1726882538.17718: Evaluated conditional (not __network_is_ostree is defined): False 28011 1726882538.17721: when evaluation is False, skipping this task 28011 1726882538.17723: _execute() done 28011 1726882538.17727: dumping result to json 28011 1726882538.17729: done dumping result, returning 28011 1726882538.17737: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-962d-7c65-000000000519] 28011 1726882538.17741: sending task result for task 12673a56-9f93-962d-7c65-000000000519 28011 1726882538.17819: done sending task result for task 12673a56-9f93-962d-7c65-000000000519 28011 1726882538.17822: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28011 1726882538.17883: no more pending results, returning what we have 28011 1726882538.17886: results queue empty 28011 1726882538.17887: checking for any_errors_fatal 28011 1726882538.17896: done checking for any_errors_fatal 28011 1726882538.17897: checking for max_fail_percentage 28011 1726882538.17899: done checking for max_fail_percentage 28011 1726882538.17899: checking to see if all hosts have failed and the running result is not ok 28011 
1726882538.17900: done checking to see if all hosts have failed 28011 1726882538.17901: getting the remaining hosts for this loop 28011 1726882538.17903: done getting the remaining hosts for this loop 28011 1726882538.17906: getting the next task for host managed_node1 28011 1726882538.17911: done getting next task for host managed_node1 28011 1726882538.17914: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28011 1726882538.17918: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882538.17930: getting variables 28011 1726882538.17933: in VariableManager get_vars() 28011 1726882538.17963: Calling all_inventory to load vars for managed_node1 28011 1726882538.17965: Calling groups_inventory to load vars for managed_node1 28011 1726882538.17968: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882538.17975: Calling all_plugins_play to load vars for managed_node1 28011 1726882538.17978: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882538.17980: Calling groups_plugins_play to load vars for managed_node1 28011 1726882538.18090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882538.18215: done with get_vars() 28011 1726882538.18223: done getting variables 28011 1726882538.18259: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:35:38 -0400 (0:00:00.019) 0:00:07.734 ****** 28011 1726882538.18280: entering _queue_task() for managed_node1/set_fact 28011 1726882538.18465: worker is 1 (out of 1 available) 28011 1726882538.18478: exiting _queue_task() for managed_node1/set_fact 28011 1726882538.18490: done queuing things up, now waiting for results queue to drain 28011 1726882538.18492: waiting for pending results... 
28011 1726882538.18657: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28011 1726882538.18749: in run() - task 12673a56-9f93-962d-7c65-00000000051a 28011 1726882538.18760: variable 'ansible_search_path' from source: unknown 28011 1726882538.18764: variable 'ansible_search_path' from source: unknown 28011 1726882538.18789: calling self._execute() 28011 1726882538.18854: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882538.18857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882538.18867: variable 'omit' from source: magic vars 28011 1726882538.19126: variable 'ansible_distribution_major_version' from source: facts 28011 1726882538.19135: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882538.19245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882538.19694: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882538.19730: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882538.19753: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882538.19776: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882538.19841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882538.19857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882538.19875: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882538.19892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882538.19953: variable '__network_is_ostree' from source: set_fact 28011 1726882538.19959: Evaluated conditional (not __network_is_ostree is defined): False 28011 1726882538.19962: when evaluation is False, skipping this task 28011 1726882538.19965: _execute() done 28011 1726882538.19967: dumping result to json 28011 1726882538.19969: done dumping result, returning 28011 1726882538.19977: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-962d-7c65-00000000051a] 28011 1726882538.19982: sending task result for task 12673a56-9f93-962d-7c65-00000000051a 28011 1726882538.20061: done sending task result for task 12673a56-9f93-962d-7c65-00000000051a 28011 1726882538.20064: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28011 1726882538.20110: no more pending results, returning what we have 28011 1726882538.20113: results queue empty 28011 1726882538.20114: checking for any_errors_fatal 28011 1726882538.20118: done checking for any_errors_fatal 28011 1726882538.20119: checking for max_fail_percentage 28011 1726882538.20121: done checking for max_fail_percentage 28011 1726882538.20122: checking to see if all hosts have failed and the running result is not ok 28011 1726882538.20123: done checking to see if all hosts have failed 28011 1726882538.20123: getting the remaining hosts for this loop 28011 1726882538.20125: done getting the remaining hosts for this loop 
28011 1726882538.20128: getting the next task for host managed_node1 28011 1726882538.20136: done getting next task for host managed_node1 28011 1726882538.20139: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28011 1726882538.20142: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882538.20153: getting variables 28011 1726882538.20155: in VariableManager get_vars() 28011 1726882538.20191: Calling all_inventory to load vars for managed_node1 28011 1726882538.20200: Calling groups_inventory to load vars for managed_node1 28011 1726882538.20203: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882538.20210: Calling all_plugins_play to load vars for managed_node1 28011 1726882538.20213: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882538.20215: Calling groups_plugins_play to load vars for managed_node1 28011 1726882538.20518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882538.20637: done with get_vars() 28011 1726882538.20644: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:35:38 -0400 (0:00:00.024) 0:00:07.758 ****** 28011 1726882538.20703: entering _queue_task() for managed_node1/service_facts 28011 1726882538.20705: Creating lock for service_facts 28011 1726882538.20891: worker is 1 (out of 1 available) 28011 1726882538.20906: exiting _queue_task() for managed_node1/service_facts 28011 1726882538.20919: done queuing things up, now waiting for results queue to drain 28011 1726882538.20920: waiting for pending results... 
28011 1726882538.21076: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 28011 1726882538.21170: in run() - task 12673a56-9f93-962d-7c65-00000000051c 28011 1726882538.21181: variable 'ansible_search_path' from source: unknown 28011 1726882538.21184: variable 'ansible_search_path' from source: unknown 28011 1726882538.21216: calling self._execute() 28011 1726882538.21274: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882538.21278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882538.21287: variable 'omit' from source: magic vars 28011 1726882538.21550: variable 'ansible_distribution_major_version' from source: facts 28011 1726882538.21560: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882538.21565: variable 'omit' from source: magic vars 28011 1726882538.21615: variable 'omit' from source: magic vars 28011 1726882538.21639: variable 'omit' from source: magic vars 28011 1726882538.21668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882538.21697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882538.21718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882538.21731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882538.21741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882538.21765: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882538.21768: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882538.21770: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 28011 1726882538.21845: Set connection var ansible_connection to ssh 28011 1726882538.21852: Set connection var ansible_pipelining to False 28011 1726882538.21857: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882538.21863: Set connection var ansible_shell_executable to /bin/sh 28011 1726882538.21870: Set connection var ansible_timeout to 10 28011 1726882538.21875: Set connection var ansible_shell_type to sh 28011 1726882538.21895: variable 'ansible_shell_executable' from source: unknown 28011 1726882538.21899: variable 'ansible_connection' from source: unknown 28011 1726882538.21904: variable 'ansible_module_compression' from source: unknown 28011 1726882538.21907: variable 'ansible_shell_type' from source: unknown 28011 1726882538.21911: variable 'ansible_shell_executable' from source: unknown 28011 1726882538.21914: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882538.21916: variable 'ansible_pipelining' from source: unknown 28011 1726882538.21918: variable 'ansible_timeout' from source: unknown 28011 1726882538.21920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882538.22056: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882538.22063: variable 'omit' from source: magic vars 28011 1726882538.22067: starting attempt loop 28011 1726882538.22070: running the handler 28011 1726882538.22082: _low_level_execute_command(): starting 28011 1726882538.22092: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882538.22597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28011 1726882538.22601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882538.22604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882538.22662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882538.22665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882538.22667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882538.22739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882538.24346: stdout chunk (state=3): >>>/root <<< 28011 1726882538.24455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882538.24479: stderr chunk (state=3): >>><<< 28011 1726882538.24482: stdout chunk (state=3): >>><<< 28011 1726882538.24506: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882538.24517: _low_level_execute_command(): starting 28011 1726882538.24523: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412 `" && echo ansible-tmp-1726882538.2450466-28458-105949153594412="` echo /root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412 `" ) && sleep 0' 28011 1726882538.24961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882538.24966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882538.24969: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 
1726882538.24980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882538.24983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882538.25028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882538.25035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882538.25037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882538.25078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882538.26944: stdout chunk (state=3): >>>ansible-tmp-1726882538.2450466-28458-105949153594412=/root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412 <<< 28011 1726882538.27051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882538.27077: stderr chunk (state=3): >>><<< 28011 1726882538.27081: stdout chunk (state=3): >>><<< 28011 1726882538.27098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882538.2450466-28458-105949153594412=/root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882538.27137: variable 'ansible_module_compression' from source: unknown 28011 1726882538.27199: ANSIBALLZ: Using lock for service_facts 28011 1726882538.27202: ANSIBALLZ: Acquiring lock 28011 1726882538.27205: ANSIBALLZ: Lock acquired: 139767585055392 28011 1726882538.27207: ANSIBALLZ: Creating module 28011 1726882538.42196: ANSIBALLZ: Writing module into payload 28011 1726882538.42397: ANSIBALLZ: Writing module 28011 1726882538.42421: ANSIBALLZ: Renaming module 28011 1726882538.42428: ANSIBALLZ: Done creating module 28011 1726882538.42446: variable 'ansible_facts' from source: unknown 28011 1726882538.42736: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412/AnsiballZ_service_facts.py 28011 1726882538.42986: Sending initial data 28011 1726882538.43042: Sent initial data (162 bytes) 28011 1726882538.44911: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882538.44955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882538.45031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882538.46580: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28011 1726882538.46605: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882538.46665: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882538.46729: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp0pd7d1yr /root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412/AnsiballZ_service_facts.py <<< 28011 1726882538.46745: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412/AnsiballZ_service_facts.py" <<< 28011 1726882538.46969: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp0pd7d1yr" to remote "/root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412/AnsiballZ_service_facts.py" <<< 28011 1726882538.48712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882538.48745: stderr chunk (state=3): >>><<< 28011 1726882538.48756: stdout chunk (state=3): >>><<< 28011 1726882538.48907: done transferring module to remote 28011 1726882538.48924: _low_level_execute_command(): starting 28011 1726882538.48973: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412/ /root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412/AnsiballZ_service_facts.py && sleep 0' 28011 1726882538.50308: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882538.50321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882538.50332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882538.50382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882538.50810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882538.50884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882538.52616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882538.52648: stderr chunk (state=3): >>><<< 28011 1726882538.52658: stdout chunk (state=3): >>><<< 28011 1726882538.52685: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882538.52699: _low_level_execute_command(): starting 28011 1726882538.52731: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412/AnsiballZ_service_facts.py && sleep 0' 28011 1726882538.54329: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882538.54343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882538.54369: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882538.54411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882538.54471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 
1726882538.54700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882540.06664: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": 
"dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 28011 1726882540.06679: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 28011 1726882540.06703: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": 
"enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": 
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 28011 1726882540.06725: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 28011 1726882540.06749: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": 
"enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 28011 1726882540.06756: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source"<<< 28011 1726882540.06759: stdout chunk (state=3): >>>: "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28011 1726882540.08404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882540.08408: stdout chunk (state=3): >>><<< 28011 1726882540.08410: stderr chunk (state=3): >>><<< 28011 1726882540.08419: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": 
"active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": 
"sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": 
{"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882540.09011: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882540.09026: _low_level_execute_command(): starting 28011 1726882540.09037: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882538.2450466-28458-105949153594412/ > /dev/null 2>&1 && sleep 0' 28011 1726882540.09678: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882540.09701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882540.09762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882540.09824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882540.09878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882540.09920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882540.11791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882540.11800: stdout chunk (state=3): >>><<< 28011 1726882540.11810: stderr chunk (state=3): >>><<< 28011 1726882540.12001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 28011 1726882540.12005: handler run complete 28011 1726882540.12046: variable 'ansible_facts' from source: unknown 28011 1726882540.12217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882540.12553: variable 'ansible_facts' from source: unknown 28011 1726882540.12643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882540.12759: attempt loop complete, returning result 28011 1726882540.12764: _execute() done 28011 1726882540.12767: dumping result to json 28011 1726882540.12810: done dumping result, returning 28011 1726882540.12820: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-962d-7c65-00000000051c] 28011 1726882540.12823: sending task result for task 12673a56-9f93-962d-7c65-00000000051c ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882540.13376: no more pending results, returning what we have 28011 1726882540.13379: results queue empty 28011 1726882540.13379: checking for any_errors_fatal 28011 1726882540.13383: done checking for any_errors_fatal 28011 1726882540.13383: checking for max_fail_percentage 28011 1726882540.13385: done checking for max_fail_percentage 28011 1726882540.13385: checking to see if all hosts have failed and the running result is not ok 28011 1726882540.13386: done checking to see if all hosts have failed 28011 1726882540.13387: getting the remaining hosts for this loop 28011 1726882540.13388: done getting the remaining hosts for this loop 28011 1726882540.13391: getting the next task for host managed_node1 28011 1726882540.13398: done getting next task for host managed_node1 28011 1726882540.13401: ^ task is: TASK: 
fedora.linux_system_roles.network : Check which packages are installed 28011 1726882540.13408: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882540.13418: getting variables 28011 1726882540.13419: in VariableManager get_vars() 28011 1726882540.13451: Calling all_inventory to load vars for managed_node1 28011 1726882540.13453: Calling groups_inventory to load vars for managed_node1 28011 1726882540.13455: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882540.13463: Calling all_plugins_play to load vars for managed_node1 28011 1726882540.13465: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882540.13467: Calling groups_plugins_play to load vars for managed_node1 28011 1726882540.13721: done sending task result for task 12673a56-9f93-962d-7c65-00000000051c 28011 1726882540.13725: WORKER PROCESS EXITING 28011 1726882540.13736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882540.14031: done with get_vars() 28011 1726882540.14041: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task 
path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:35:40 -0400 (0:00:01.934) 0:00:09.692 ****** 28011 1726882540.14111: entering _queue_task() for managed_node1/package_facts 28011 1726882540.14112: Creating lock for package_facts 28011 1726882540.14320: worker is 1 (out of 1 available) 28011 1726882540.14332: exiting _queue_task() for managed_node1/package_facts 28011 1726882540.14345: done queuing things up, now waiting for results queue to drain 28011 1726882540.14346: waiting for pending results... 28011 1726882540.14515: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 28011 1726882540.14605: in run() - task 12673a56-9f93-962d-7c65-00000000051d 28011 1726882540.14616: variable 'ansible_search_path' from source: unknown 28011 1726882540.14620: variable 'ansible_search_path' from source: unknown 28011 1726882540.14649: calling self._execute() 28011 1726882540.14713: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882540.14716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882540.14725: variable 'omit' from source: magic vars 28011 1726882540.14998: variable 'ansible_distribution_major_version' from source: facts 28011 1726882540.15012: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882540.15017: variable 'omit' from source: magic vars 28011 1726882540.15064: variable 'omit' from source: magic vars 28011 1726882540.15089: variable 'omit' from source: magic vars 28011 1726882540.15123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882540.15150: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882540.15165: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882540.15178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882540.15187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882540.15213: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882540.15217: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882540.15221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882540.15291: Set connection var ansible_connection to ssh 28011 1726882540.15296: Set connection var ansible_pipelining to False 28011 1726882540.15303: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882540.15308: Set connection var ansible_shell_executable to /bin/sh 28011 1726882540.15314: Set connection var ansible_timeout to 10 28011 1726882540.15319: Set connection var ansible_shell_type to sh 28011 1726882540.15338: variable 'ansible_shell_executable' from source: unknown 28011 1726882540.15340: variable 'ansible_connection' from source: unknown 28011 1726882540.15343: variable 'ansible_module_compression' from source: unknown 28011 1726882540.15345: variable 'ansible_shell_type' from source: unknown 28011 1726882540.15348: variable 'ansible_shell_executable' from source: unknown 28011 1726882540.15351: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882540.15355: variable 'ansible_pipelining' from source: unknown 28011 1726882540.15357: variable 'ansible_timeout' from source: unknown 28011 1726882540.15363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882540.15504: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882540.15512: variable 'omit' from source: magic vars 28011 1726882540.15515: starting attempt loop 28011 1726882540.15518: running the handler 28011 1726882540.15531: _low_level_execute_command(): starting 28011 1726882540.15538: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882540.16051: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882540.16054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882540.16057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882540.16059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882540.16099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882540.16122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882540.16168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882540.17741: stdout chunk 
(state=3): >>>/root <<< 28011 1726882540.17845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882540.17873: stderr chunk (state=3): >>><<< 28011 1726882540.17877: stdout chunk (state=3): >>><<< 28011 1726882540.17897: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882540.17909: _low_level_execute_command(): starting 28011 1726882540.17915: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264 `" && echo ansible-tmp-1726882540.1789732-28538-274137721299264="` echo /root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264 `" ) && sleep 0' 28011 1726882540.18329: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882540.18332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882540.18343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882540.18412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882540.18415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882540.18452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882540.20303: stdout chunk (state=3): >>>ansible-tmp-1726882540.1789732-28538-274137721299264=/root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264 <<< 28011 1726882540.20403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882540.20431: stderr chunk (state=3): >>><<< 28011 1726882540.20436: stdout chunk (state=3): >>><<< 28011 1726882540.20447: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882540.1789732-28538-274137721299264=/root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264 , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882540.20480: variable 'ansible_module_compression' from source: unknown 28011 1726882540.20522: ANSIBALLZ: Using lock for package_facts 28011 1726882540.20525: ANSIBALLZ: Acquiring lock 28011 1726882540.20528: ANSIBALLZ: Lock acquired: 139767561852432 28011 1726882540.20530: ANSIBALLZ: Creating module 28011 1726882540.44600: ANSIBALLZ: Writing module into payload 28011 1726882540.44604: ANSIBALLZ: Writing module 28011 1726882540.44607: ANSIBALLZ: Renaming module 28011 1726882540.44609: ANSIBALLZ: Done creating module 28011 1726882540.44639: variable 'ansible_facts' from source: unknown 28011 1726882540.44830: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264/AnsiballZ_package_facts.py 28011 1726882540.45076: Sending initial data 28011 1726882540.45086: Sent initial data (162 
bytes) 28011 1726882540.45711: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882540.45734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882540.45753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882540.45832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882540.47485: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882540.47558: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882540.47607: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp9t8rwzki /root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264/AnsiballZ_package_facts.py <<< 28011 1726882540.47623: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264/AnsiballZ_package_facts.py" <<< 28011 1726882540.47675: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp9t8rwzki" to remote "/root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264/AnsiballZ_package_facts.py" <<< 28011 1726882540.49403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882540.49496: stderr chunk (state=3): >>><<< 28011 1726882540.49509: stdout chunk (state=3): >>><<< 28011 1726882540.49572: done transferring module to remote 28011 1726882540.49587: _low_level_execute_command(): starting 28011 1726882540.49601: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264/ /root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264/AnsiballZ_package_facts.py && sleep 0' 28011 1726882540.50251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882540.50265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882540.50277: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 28011 1726882540.50301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882540.50318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882540.50328: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882540.50355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28011 1726882540.50446: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882540.50476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882540.50544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882540.52370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882540.52381: stdout chunk (state=3): >>><<< 28011 1726882540.52397: stderr chunk (state=3): >>><<< 28011 1726882540.52420: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882540.52437: _low_level_execute_command(): starting 28011 1726882540.52515: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264/AnsiballZ_package_facts.py && sleep 0' 28011 1726882540.53042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882540.53056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882540.53072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882540.53092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882540.53113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882540.53215: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882540.53231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882540.53251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882540.53326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882540.96911: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": 
[{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 28011 1726882540.97024: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": 
"1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": 
"1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": 
"libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", 
"version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": 
"1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", 
"version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 28011 1726882540.97096: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": 
[{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", 
"version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": 
"perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": 
"python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0",
"release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28011 1726882540.98928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882540.98931: stdout chunk (state=3): >>><<< 28011 1726882540.98933: stderr chunk (state=3): >>><<< 28011 1726882540.99209: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882541.05516: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882541.05552: _low_level_execute_command(): starting 28011 1726882541.05647: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882540.1789732-28538-274137721299264/ > /dev/null 2>&1 && sleep 0' 28011 1726882541.06964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882541.07101: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882541.07180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882541.07210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882541.07348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882541.09500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882541.09504: stdout chunk (state=3): >>><<< 28011 1726882541.09506: stderr chunk (state=3): >>><<< 28011 1726882541.09509: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882541.09511: handler run complete 28011 1726882541.10306: variable 'ansible_facts' from source: unknown 28011 1726882541.10925: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882541.22974: variable 'ansible_facts' from source: unknown 28011 1726882541.24200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882541.25296: attempt loop complete, returning result 28011 1726882541.25315: _execute() done 28011 1726882541.25598: dumping result to json 28011 1726882541.25694: done dumping result, returning 28011 1726882541.25998: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-962d-7c65-00000000051d] 28011 1726882541.26001: sending task result for task 12673a56-9f93-962d-7c65-00000000051d 28011 1726882541.29953: done sending task result for task 12673a56-9f93-962d-7c65-00000000051d 28011 1726882541.29957: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882541.30057: no more pending results, returning what we have 28011 1726882541.30059: results queue empty 28011 1726882541.30060: checking for any_errors_fatal 28011 1726882541.30064: done checking for any_errors_fatal 28011 1726882541.30064: checking for max_fail_percentage 28011 1726882541.30066: done checking for max_fail_percentage 28011 1726882541.30067: checking to see if all hosts have failed and the running result is not ok 28011 1726882541.30068: done checking to see if all hosts have failed 28011 1726882541.30068: getting the remaining hosts for this loop 28011 1726882541.30069: done getting the remaining hosts for this loop 28011 1726882541.30073: getting the next task for host managed_node1 28011 1726882541.30079: done getting next task for host managed_node1 28011 1726882541.30082: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28011 1726882541.30084: 
^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882541.30098: getting variables 28011 1726882541.30100: in VariableManager get_vars() 28011 1726882541.30131: Calling all_inventory to load vars for managed_node1 28011 1726882541.30134: Calling groups_inventory to load vars for managed_node1 28011 1726882541.30137: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882541.30145: Calling all_plugins_play to load vars for managed_node1 28011 1726882541.30148: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882541.30151: Calling groups_plugins_play to load vars for managed_node1 28011 1726882541.32565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882541.35735: done with get_vars() 28011 1726882541.35761: done getting variables 28011 1726882541.36031: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:35:41 -0400 (0:00:01.219) 0:00:10.911 ****** 28011 
1726882541.36067: entering _queue_task() for managed_node1/debug 28011 1726882541.36834: worker is 1 (out of 1 available) 28011 1726882541.36846: exiting _queue_task() for managed_node1/debug 28011 1726882541.36858: done queuing things up, now waiting for results queue to drain 28011 1726882541.36860: waiting for pending results... 28011 1726882541.37311: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 28011 1726882541.37646: in run() - task 12673a56-9f93-962d-7c65-000000000017 28011 1726882541.37650: variable 'ansible_search_path' from source: unknown 28011 1726882541.37652: variable 'ansible_search_path' from source: unknown 28011 1726882541.37655: calling self._execute() 28011 1726882541.37759: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882541.37764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882541.37776: variable 'omit' from source: magic vars 28011 1726882541.38746: variable 'ansible_distribution_major_version' from source: facts 28011 1726882541.38828: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882541.38835: variable 'omit' from source: magic vars 28011 1726882541.38942: variable 'omit' from source: magic vars 28011 1726882541.39401: variable 'network_provider' from source: set_fact 28011 1726882541.39404: variable 'omit' from source: magic vars 28011 1726882541.39408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882541.39411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882541.39413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882541.39416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882541.39418: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882541.39420: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882541.39422: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882541.39424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882541.39509: Set connection var ansible_connection to ssh 28011 1726882541.39512: Set connection var ansible_pipelining to False 28011 1726882541.39514: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882541.39516: Set connection var ansible_shell_executable to /bin/sh 28011 1726882541.39518: Set connection var ansible_timeout to 10 28011 1726882541.39520: Set connection var ansible_shell_type to sh 28011 1726882541.39523: variable 'ansible_shell_executable' from source: unknown 28011 1726882541.39525: variable 'ansible_connection' from source: unknown 28011 1726882541.39528: variable 'ansible_module_compression' from source: unknown 28011 1726882541.39530: variable 'ansible_shell_type' from source: unknown 28011 1726882541.39532: variable 'ansible_shell_executable' from source: unknown 28011 1726882541.39534: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882541.39539: variable 'ansible_pipelining' from source: unknown 28011 1726882541.39541: variable 'ansible_timeout' from source: unknown 28011 1726882541.39546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882541.39700: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882541.39705: variable 'omit' from source: magic vars 28011 
1726882541.39708: starting attempt loop 28011 1726882541.39711: running the handler 28011 1726882541.39778: handler run complete 28011 1726882541.39797: attempt loop complete, returning result 28011 1726882541.39800: _execute() done 28011 1726882541.39803: dumping result to json 28011 1726882541.39805: done dumping result, returning 28011 1726882541.39815: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-962d-7c65-000000000017] 28011 1726882541.39820: sending task result for task 12673a56-9f93-962d-7c65-000000000017 28011 1726882541.39913: done sending task result for task 12673a56-9f93-962d-7c65-000000000017 28011 1726882541.39915: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 28011 1726882541.39987: no more pending results, returning what we have 28011 1726882541.39994: results queue empty 28011 1726882541.39995: checking for any_errors_fatal 28011 1726882541.40005: done checking for any_errors_fatal 28011 1726882541.40006: checking for max_fail_percentage 28011 1726882541.40007: done checking for max_fail_percentage 28011 1726882541.40008: checking to see if all hosts have failed and the running result is not ok 28011 1726882541.40009: done checking to see if all hosts have failed 28011 1726882541.40010: getting the remaining hosts for this loop 28011 1726882541.40011: done getting the remaining hosts for this loop 28011 1726882541.40014: getting the next task for host managed_node1 28011 1726882541.40021: done getting next task for host managed_node1 28011 1726882541.40024: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28011 1726882541.40027: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882541.40037: getting variables 28011 1726882541.40039: in VariableManager get_vars() 28011 1726882541.40076: Calling all_inventory to load vars for managed_node1 28011 1726882541.40078: Calling groups_inventory to load vars for managed_node1 28011 1726882541.40080: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882541.40089: Calling all_plugins_play to load vars for managed_node1 28011 1726882541.40298: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882541.40303: Calling groups_plugins_play to load vars for managed_node1 28011 1726882541.41829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882541.43994: done with get_vars() 28011 1726882541.44019: done getting variables 28011 1726882541.44077: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:35:41 -0400 (0:00:00.080) 0:00:10.992 ****** 28011 1726882541.44152: entering _queue_task() for managed_node1/fail 28011 1726882541.44719: worker is 1 (out of 1 available) 
28011 1726882541.44729: exiting _queue_task() for managed_node1/fail 28011 1726882541.44739: done queuing things up, now waiting for results queue to drain 28011 1726882541.44740: waiting for pending results... 28011 1726882541.45010: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28011 1726882541.45017: in run() - task 12673a56-9f93-962d-7c65-000000000018 28011 1726882541.45020: variable 'ansible_search_path' from source: unknown 28011 1726882541.45022: variable 'ansible_search_path' from source: unknown 28011 1726882541.45044: calling self._execute() 28011 1726882541.45136: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882541.45141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882541.45154: variable 'omit' from source: magic vars 28011 1726882541.45547: variable 'ansible_distribution_major_version' from source: facts 28011 1726882541.45559: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882541.45688: variable 'network_state' from source: role '' defaults 28011 1726882541.45699: Evaluated conditional (network_state != {}): False 28011 1726882541.45702: when evaluation is False, skipping this task 28011 1726882541.45705: _execute() done 28011 1726882541.45708: dumping result to json 28011 1726882541.45711: done dumping result, returning 28011 1726882541.45720: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-962d-7c65-000000000018] 28011 1726882541.45736: sending task result for task 12673a56-9f93-962d-7c65-000000000018 28011 1726882541.45828: done sending task result for task 12673a56-9f93-962d-7c65-000000000018 28011 1726882541.45831: WORKER 
PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882541.45878: no more pending results, returning what we have 28011 1726882541.45882: results queue empty 28011 1726882541.45883: checking for any_errors_fatal 28011 1726882541.45890: done checking for any_errors_fatal 28011 1726882541.45891: checking for max_fail_percentage 28011 1726882541.45894: done checking for max_fail_percentage 28011 1726882541.45895: checking to see if all hosts have failed and the running result is not ok 28011 1726882541.45896: done checking to see if all hosts have failed 28011 1726882541.45897: getting the remaining hosts for this loop 28011 1726882541.45898: done getting the remaining hosts for this loop 28011 1726882541.45902: getting the next task for host managed_node1 28011 1726882541.45908: done getting next task for host managed_node1 28011 1726882541.45913: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28011 1726882541.45916: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882541.45932: getting variables 28011 1726882541.45934: in VariableManager get_vars() 28011 1726882541.46135: Calling all_inventory to load vars for managed_node1 28011 1726882541.46138: Calling groups_inventory to load vars for managed_node1 28011 1726882541.46141: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882541.46150: Calling all_plugins_play to load vars for managed_node1 28011 1726882541.46153: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882541.46156: Calling groups_plugins_play to load vars for managed_node1 28011 1726882541.48810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882541.50644: done with get_vars() 28011 1726882541.50674: done getting variables 28011 1726882541.50734: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:35:41 -0400 (0:00:00.066) 0:00:11.058 ****** 28011 1726882541.50767: entering _queue_task() for managed_node1/fail 28011 1726882541.51205: worker is 1 (out of 1 available) 28011 1726882541.51223: exiting _queue_task() for managed_node1/fail 28011 1726882541.51235: done queuing things up, now waiting for results queue to drain 28011 1726882541.51236: waiting for pending results... 
28011 1726882541.51495: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28011 1726882541.51718: in run() - task 12673a56-9f93-962d-7c65-000000000019 28011 1726882541.51722: variable 'ansible_search_path' from source: unknown 28011 1726882541.51725: variable 'ansible_search_path' from source: unknown 28011 1726882541.51728: calling self._execute() 28011 1726882541.51730: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882541.51733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882541.51744: variable 'omit' from source: magic vars 28011 1726882541.52383: variable 'ansible_distribution_major_version' from source: facts 28011 1726882541.52397: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882541.52798: variable 'network_state' from source: role '' defaults 28011 1726882541.52801: Evaluated conditional (network_state != {}): False 28011 1726882541.52803: when evaluation is False, skipping this task 28011 1726882541.52805: _execute() done 28011 1726882541.52806: dumping result to json 28011 1726882541.52808: done dumping result, returning 28011 1726882541.52810: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-962d-7c65-000000000019] 28011 1726882541.52812: sending task result for task 12673a56-9f93-962d-7c65-000000000019 28011 1726882541.52874: done sending task result for task 12673a56-9f93-962d-7c65-000000000019 28011 1726882541.52878: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882541.52926: no more pending results, returning what we have 28011 
1726882541.52930: results queue empty 28011 1726882541.52930: checking for any_errors_fatal 28011 1726882541.52937: done checking for any_errors_fatal 28011 1726882541.52938: checking for max_fail_percentage 28011 1726882541.52939: done checking for max_fail_percentage 28011 1726882541.52940: checking to see if all hosts have failed and the running result is not ok 28011 1726882541.52941: done checking to see if all hosts have failed 28011 1726882541.52942: getting the remaining hosts for this loop 28011 1726882541.52943: done getting the remaining hosts for this loop 28011 1726882541.52947: getting the next task for host managed_node1 28011 1726882541.52959: done getting next task for host managed_node1 28011 1726882541.52963: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28011 1726882541.52966: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882541.52983: getting variables 28011 1726882541.52985: in VariableManager get_vars() 28011 1726882541.53028: Calling all_inventory to load vars for managed_node1 28011 1726882541.53031: Calling groups_inventory to load vars for managed_node1 28011 1726882541.53034: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882541.53045: Calling all_plugins_play to load vars for managed_node1 28011 1726882541.53048: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882541.53051: Calling groups_plugins_play to load vars for managed_node1 28011 1726882541.54836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882541.57088: done with get_vars() 28011 1726882541.57122: done getting variables 28011 1726882541.57181: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:35:41 -0400 (0:00:00.064) 0:00:11.123 ****** 28011 1726882541.57226: entering _queue_task() for managed_node1/fail 28011 1726882541.57579: worker is 1 (out of 1 available) 28011 1726882541.57592: exiting _queue_task() for managed_node1/fail 28011 1726882541.57606: done queuing things up, now waiting for results queue to drain 28011 1726882541.57607: waiting for pending results... 
28011 1726882541.58111: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28011 1726882541.58116: in run() - task 12673a56-9f93-962d-7c65-00000000001a 28011 1726882541.58119: variable 'ansible_search_path' from source: unknown 28011 1726882541.58122: variable 'ansible_search_path' from source: unknown 28011 1726882541.58125: calling self._execute() 28011 1726882541.58197: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882541.58202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882541.58213: variable 'omit' from source: magic vars 28011 1726882541.58599: variable 'ansible_distribution_major_version' from source: facts 28011 1726882541.58612: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882541.58797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882541.61087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882541.61171: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882541.61209: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882541.61254: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882541.61280: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882541.61368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882541.61398: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882541.61431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.61481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882541.61496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882541.61597: variable 'ansible_distribution_major_version' from source: facts 28011 1726882541.61609: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28011 1726882541.61731: variable 'ansible_distribution' from source: facts 28011 1726882541.61735: variable '__network_rh_distros' from source: role '' defaults 28011 1726882541.61745: Evaluated conditional (ansible_distribution in __network_rh_distros): True 28011 1726882541.62011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882541.62198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882541.62201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 
1726882541.62209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882541.62211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882541.62214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882541.62216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882541.62218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.62254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882541.62267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882541.62309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882541.62332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 28011 1726882541.62355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.62395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882541.62414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882541.62734: variable 'network_connections' from source: task vars 28011 1726882541.62745: variable 'interface' from source: set_fact 28011 1726882541.62811: variable 'interface' from source: set_fact 28011 1726882541.62822: variable 'interface' from source: set_fact 28011 1726882541.62888: variable 'interface' from source: set_fact 28011 1726882541.62906: variable 'network_state' from source: role '' defaults 28011 1726882541.62978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882541.63145: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882541.63186: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882541.63217: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882541.63244: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882541.63296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882541.63314: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882541.63372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.63375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882541.63404: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 28011 1726882541.63407: when evaluation is False, skipping this task 28011 1726882541.63409: _execute() done 28011 1726882541.63412: dumping result to json 28011 1726882541.63414: done dumping result, returning 28011 1726882541.63423: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-962d-7c65-00000000001a] 28011 1726882541.63426: sending task result for task 12673a56-9f93-962d-7c65-00000000001a 28011 1726882541.63549: done sending task result for task 12673a56-9f93-962d-7c65-00000000001a 28011 1726882541.63552: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 28011 
1726882541.63642: no more pending results, returning what we have 28011 1726882541.63646: results queue empty 28011 1726882541.63646: checking for any_errors_fatal 28011 1726882541.63652: done checking for any_errors_fatal 28011 1726882541.63653: checking for max_fail_percentage 28011 1726882541.63655: done checking for max_fail_percentage 28011 1726882541.63656: checking to see if all hosts have failed and the running result is not ok 28011 1726882541.63656: done checking to see if all hosts have failed 28011 1726882541.63657: getting the remaining hosts for this loop 28011 1726882541.63658: done getting the remaining hosts for this loop 28011 1726882541.63662: getting the next task for host managed_node1 28011 1726882541.63669: done getting next task for host managed_node1 28011 1726882541.63673: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28011 1726882541.63675: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882541.63691: getting variables 28011 1726882541.63695: in VariableManager get_vars() 28011 1726882541.63744: Calling all_inventory to load vars for managed_node1 28011 1726882541.63747: Calling groups_inventory to load vars for managed_node1 28011 1726882541.63749: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882541.63761: Calling all_plugins_play to load vars for managed_node1 28011 1726882541.63764: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882541.63768: Calling groups_plugins_play to load vars for managed_node1 28011 1726882541.65082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882541.65945: done with get_vars() 28011 1726882541.65963: done getting variables 28011 1726882541.66037: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:35:41 -0400 (0:00:00.088) 0:00:11.211 ****** 28011 1726882541.66059: entering _queue_task() for managed_node1/dnf 28011 1726882541.66279: worker is 1 (out of 1 available) 28011 1726882541.66292: exiting _queue_task() for managed_node1/dnf 28011 1726882541.66306: done queuing things up, now waiting for results queue to drain 28011 1726882541.66307: waiting for pending results... 
28011 1726882541.66495: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28011 1726882541.66595: in run() - task 12673a56-9f93-962d-7c65-00000000001b 28011 1726882541.66622: variable 'ansible_search_path' from source: unknown 28011 1726882541.66626: variable 'ansible_search_path' from source: unknown 28011 1726882541.66667: calling self._execute() 28011 1726882541.66734: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882541.66738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882541.66899: variable 'omit' from source: magic vars 28011 1726882541.67108: variable 'ansible_distribution_major_version' from source: facts 28011 1726882541.67132: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882541.67318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882541.69345: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882541.69630: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882541.69658: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882541.69685: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882541.69708: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882541.69763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882541.69787: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882541.69807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.69832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882541.69843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882541.69921: variable 'ansible_distribution' from source: facts 28011 1726882541.69925: variable 'ansible_distribution_major_version' from source: facts 28011 1726882541.69936: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28011 1726882541.70010: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882541.70092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882541.70110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882541.70128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.70152: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882541.70162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882541.70189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882541.70207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882541.70227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.70251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882541.70261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882541.70288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882541.70307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 
1726882541.70325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.70350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882541.70361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882541.70460: variable 'network_connections' from source: task vars 28011 1726882541.70469: variable 'interface' from source: set_fact 28011 1726882541.70519: variable 'interface' from source: set_fact 28011 1726882541.70526: variable 'interface' from source: set_fact 28011 1726882541.70570: variable 'interface' from source: set_fact 28011 1726882541.70622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882541.70731: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882541.70761: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882541.70780: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882541.70804: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882541.70833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882541.70847: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882541.70871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.70891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882541.70932: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882541.71074: variable 'network_connections' from source: task vars 28011 1726882541.71080: variable 'interface' from source: set_fact 28011 1726882541.71152: variable 'interface' from source: set_fact 28011 1726882541.71155: variable 'interface' from source: set_fact 28011 1726882541.71241: variable 'interface' from source: set_fact 28011 1726882541.71244: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28011 1726882541.71246: when evaluation is False, skipping this task 28011 1726882541.71249: _execute() done 28011 1726882541.71251: dumping result to json 28011 1726882541.71253: done dumping result, returning 28011 1726882541.71255: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-962d-7c65-00000000001b] 28011 1726882541.71257: sending task result for task 12673a56-9f93-962d-7c65-00000000001b 28011 1726882541.71419: done sending task result for task 12673a56-9f93-962d-7c65-00000000001b 28011 1726882541.71422: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 28011 1726882541.71478: no more pending results, returning what we have 28011 1726882541.71482: results queue empty 28011 1726882541.71483: checking for any_errors_fatal 28011 1726882541.71509: done checking for any_errors_fatal 28011 1726882541.71510: checking for max_fail_percentage 28011 1726882541.71513: done checking for max_fail_percentage 28011 1726882541.71514: checking to see if all hosts have failed and the running result is not ok 28011 1726882541.71515: done checking to see if all hosts have failed 28011 1726882541.71515: getting the remaining hosts for this loop 28011 1726882541.71517: done getting the remaining hosts for this loop 28011 1726882541.71522: getting the next task for host managed_node1 28011 1726882541.71529: done getting next task for host managed_node1 28011 1726882541.71541: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28011 1726882541.71544: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882541.71558: getting variables 28011 1726882541.71560: in VariableManager get_vars() 28011 1726882541.71605: Calling all_inventory to load vars for managed_node1 28011 1726882541.71607: Calling groups_inventory to load vars for managed_node1 28011 1726882541.71609: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882541.71620: Calling all_plugins_play to load vars for managed_node1 28011 1726882541.71622: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882541.71624: Calling groups_plugins_play to load vars for managed_node1 28011 1726882541.72787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882541.74164: done with get_vars() 28011 1726882541.74182: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28011 1726882541.74234: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:35:41 -0400 (0:00:00.081) 0:00:11.293 ****** 28011 1726882541.74255: entering _queue_task() for managed_node1/yum 28011 1726882541.74256: Creating lock for yum 28011 1726882541.74483: worker is 1 (out of 1 available) 28011 1726882541.74499: exiting _queue_task() for managed_node1/yum 28011 1726882541.74512: done queuing things up, now waiting for results queue to drain 28011 1726882541.74514: waiting for pending results... 
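The trace above shows the DNF update-check task (queued from `tasks/main.yml:36`) being skipped: both role-default booleans in `__network_wireless_connections_defined or __network_team_connections_defined` evaluated false because `network_connections` defines no wireless or team interfaces, so `_execute()` short-circuits and returns the `skipping:` result without contacting the host. A minimal sketch of that guarded-check pattern, reconstructed only from the "Evaluated conditional" lines and module name in this log (not from the role's actual source; the task body is illustrative):

```yaml
# Sketch only: conditions copied from the log's "Evaluated conditional"
# lines; the dnf arguments are assumptions, not the role's real task body.
- name: Check if updates for network packages are available through DNF
  ansible.builtin.dnf:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
```

When every item in a `when:` list must hold, Ansible stops at the first false condition and reports it back as `false_condition` in the skip result, exactly as the JSON block above shows.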
28011 1726882541.74680: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28011 1726882541.74768: in run() - task 12673a56-9f93-962d-7c65-00000000001c 28011 1726882541.74779: variable 'ansible_search_path' from source: unknown 28011 1726882541.74783: variable 'ansible_search_path' from source: unknown 28011 1726882541.74815: calling self._execute() 28011 1726882541.74882: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882541.74886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882541.74896: variable 'omit' from source: magic vars 28011 1726882541.75163: variable 'ansible_distribution_major_version' from source: facts 28011 1726882541.75176: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882541.75292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882541.77197: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882541.77372: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882541.77376: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882541.77379: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882541.77381: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882541.77443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882541.77470: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882541.77498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.77538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882541.77551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882541.77642: variable 'ansible_distribution_major_version' from source: facts 28011 1726882541.77655: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28011 1726882541.77658: when evaluation is False, skipping this task 28011 1726882541.77661: _execute() done 28011 1726882541.77663: dumping result to json 28011 1726882541.77666: done dumping result, returning 28011 1726882541.77675: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-962d-7c65-00000000001c] 28011 1726882541.77678: sending task result for task 12673a56-9f93-962d-7c65-00000000001c 28011 1726882541.77766: done sending task result for task 12673a56-9f93-962d-7c65-00000000001c 28011 1726882541.77769: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28011 1726882541.77846: no more pending results, returning 
what we have 28011 1726882541.77849: results queue empty 28011 1726882541.77850: checking for any_errors_fatal 28011 1726882541.77857: done checking for any_errors_fatal 28011 1726882541.77857: checking for max_fail_percentage 28011 1726882541.77859: done checking for max_fail_percentage 28011 1726882541.77859: checking to see if all hosts have failed and the running result is not ok 28011 1726882541.77860: done checking to see if all hosts have failed 28011 1726882541.77861: getting the remaining hosts for this loop 28011 1726882541.77862: done getting the remaining hosts for this loop 28011 1726882541.77865: getting the next task for host managed_node1 28011 1726882541.77871: done getting next task for host managed_node1 28011 1726882541.77875: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28011 1726882541.77877: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882541.77890: getting variables 28011 1726882541.77891: in VariableManager get_vars() 28011 1726882541.77930: Calling all_inventory to load vars for managed_node1 28011 1726882541.77933: Calling groups_inventory to load vars for managed_node1 28011 1726882541.77935: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882541.77943: Calling all_plugins_play to load vars for managed_node1 28011 1726882541.77945: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882541.77947: Calling groups_plugins_play to load vars for managed_node1 28011 1726882541.79209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882541.80748: done with get_vars() 28011 1726882541.80770: done getting variables 28011 1726882541.80835: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:35:41 -0400 (0:00:00.066) 0:00:11.359 ****** 28011 1726882541.80868: entering _queue_task() for managed_node1/fail 28011 1726882541.81165: worker is 1 (out of 1 available) 28011 1726882541.81180: exiting _queue_task() for managed_node1/fail 28011 1726882541.81404: done queuing things up, now waiting for results queue to drain 28011 1726882541.81407: waiting for pending results... 
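The YUM-path variant of the same check is version-gated: on this host `ansible_distribution_major_version | int < 8` evaluated False, so the task skips before the yum action ever runs. Note the `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line above — on modern platforms the yum action is an alias resolved to dnf by the plugin loader, which is why the role only exercises it on releases older than 8. A hedged sketch of that gating (condition taken verbatim from the log; the task body is an assumption):

```yaml
# Sketch only: the engine redirects ansible.builtin.yum to
# ansible.builtin.dnf on >= 8 (see the "redirecting" log line),
# so the yum variant is gated to older releases.
- name: Check if updates for network packages are available through YUM
  ansible.builtin.yum:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8
```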
28011 1726882541.81615: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28011 1726882541.81622: in run() - task 12673a56-9f93-962d-7c65-00000000001d 28011 1726882541.81632: variable 'ansible_search_path' from source: unknown 28011 1726882541.81636: variable 'ansible_search_path' from source: unknown 28011 1726882541.81675: calling self._execute() 28011 1726882541.81764: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882541.81768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882541.81779: variable 'omit' from source: magic vars 28011 1726882541.82161: variable 'ansible_distribution_major_version' from source: facts 28011 1726882541.82177: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882541.82364: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882541.82489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882541.89645: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882541.89703: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882541.89737: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882541.89765: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882541.89787: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882541.89899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28011 1726882541.89904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882541.89907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.89943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882541.89958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882541.90078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882541.90082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882541.90084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.90102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882541.90187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882541.90190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882541.90194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882541.90207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.90246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882541.90259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882541.90456: variable 'network_connections' from source: task vars 28011 1726882541.90467: variable 'interface' from source: set_fact 28011 1726882541.90545: variable 'interface' from source: set_fact 28011 1726882541.90554: variable 'interface' from source: set_fact 28011 1726882541.90620: variable 'interface' from source: set_fact 28011 1726882541.90694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882541.90906: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882541.90944: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882541.90974: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882541.91007: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882541.91098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882541.91101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882541.91104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882541.91143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882541.91199: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882541.91448: variable 'network_connections' from source: task vars 28011 1726882541.91453: variable 'interface' from source: set_fact 28011 1726882541.91592: variable 'interface' from source: set_fact 28011 1726882541.91597: variable 'interface' from source: set_fact 28011 1726882541.91599: variable 'interface' from source: set_fact 28011 1726882541.91625: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28011 1726882541.91628: when evaluation is False, skipping this task 28011 1726882541.91630: _execute() done 28011 1726882541.91633: dumping result to json 28011 1726882541.91635: done dumping result, returning 28011 1726882541.91643: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-962d-7c65-00000000001d] 28011 1726882541.91653: sending task result for task 12673a56-9f93-962d-7c65-00000000001d 28011 1726882541.91940: done sending task result for task 12673a56-9f93-962d-7c65-00000000001d 28011 1726882541.91944: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28011 1726882541.91987: no more pending results, returning what we have 28011 1726882541.91995: results queue empty 28011 1726882541.91996: checking for any_errors_fatal 28011 1726882541.92001: done checking for any_errors_fatal 28011 1726882541.92002: checking for max_fail_percentage 28011 1726882541.92004: done checking for max_fail_percentage 28011 1726882541.92005: checking to see if all hosts have failed and the running result is not ok 28011 1726882541.92005: done checking to see if all hosts have failed 28011 1726882541.92006: getting the remaining hosts for this loop 28011 1726882541.92008: done getting the remaining hosts for this loop 28011 1726882541.92011: getting the next task for host managed_node1 28011 1726882541.92018: done getting next task for host managed_node1 28011 1726882541.92022: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28011 1726882541.92024: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28011 1726882541.92039: getting variables 28011 1726882541.92041: in VariableManager get_vars() 28011 1726882541.92082: Calling all_inventory to load vars for managed_node1 28011 1726882541.92085: Calling groups_inventory to load vars for managed_node1 28011 1726882541.92087: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882541.92101: Calling all_plugins_play to load vars for managed_node1 28011 1726882541.92104: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882541.92107: Calling groups_plugins_play to load vars for managed_node1 28011 1726882541.97042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882541.98621: done with get_vars() 28011 1726882541.98641: done getting variables 28011 1726882541.98686: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:35:41 -0400 (0:00:00.178) 0:00:11.538 ****** 28011 1726882541.98718: entering _queue_task() for managed_node1/package 28011 1726882541.99150: worker is 1 (out of 1 available) 28011 1726882541.99160: exiting _queue_task() for managed_node1/package 28011 1726882541.99171: done queuing things up, now waiting for results queue to drain 28011 1726882541.99173: waiting for pending results... 
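Before the role reaches "Install packages" it runs a guard task (`tasks/main.yml:60`) that uses the `fail` action to stop unless the user has consented to a NetworkManager restart; here it skips for the same reason as the DNF check, since no wireless or team connections are defined. A hypothetical reconstruction of that guard — only the action module and the conditional are visible in the log; the message text and the consent variable name are invented for illustration:

```yaml
# Sketch only: "network_allow_restart" is a guessed variable name.
# The log shows just the "fail" action and the wireless/team conditional,
# not the task body.
- name: Ask user's consent to restart NetworkManager
  ansible.builtin.fail:
    msg: Restarting NetworkManager is required but was not confirmed.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - not (network_allow_restart | default(false))
```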
28011 1726882541.99367: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 28011 1726882541.99509: in run() - task 12673a56-9f93-962d-7c65-00000000001e 28011 1726882541.99513: variable 'ansible_search_path' from source: unknown 28011 1726882541.99571: variable 'ansible_search_path' from source: unknown 28011 1726882541.99575: calling self._execute() 28011 1726882541.99655: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882541.99669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882541.99692: variable 'omit' from source: magic vars 28011 1726882542.00098: variable 'ansible_distribution_major_version' from source: facts 28011 1726882542.00120: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882542.00322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882542.00618: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882542.00768: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882542.00771: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882542.00806: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882542.00932: variable 'network_packages' from source: role '' defaults 28011 1726882542.01048: variable '__network_provider_setup' from source: role '' defaults 28011 1726882542.01063: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882542.01143: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882542.01157: variable '__network_packages_default_nm' from source: role '' defaults 28011 1726882542.01229: variable 
'__network_packages_default_nm' from source: role '' defaults 28011 1726882542.01428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882542.03492: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882542.03558: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882542.03699: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882542.03702: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882542.03704: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882542.03757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882542.03788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.03826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.03870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.03887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 
1726882542.03941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882542.03967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.03994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.04042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.04059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882542.04290: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28011 1726882542.04475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882542.04479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.04481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.04521: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.04539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882542.04642: variable 'ansible_python' from source: facts 28011 1726882542.04673: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28011 1726882542.04773: variable '__network_wpa_supplicant_required' from source: role '' defaults 28011 1726882542.04868: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28011 1726882542.05009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882542.05099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.05103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.05116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.05140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882542.05188: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882542.05238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.05268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.05314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.05341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882542.05483: variable 'network_connections' from source: task vars 28011 1726882542.05545: variable 'interface' from source: set_fact 28011 1726882542.05601: variable 'interface' from source: set_fact 28011 1726882542.05617: variable 'interface' from source: set_fact 28011 1726882542.05719: variable 'interface' from source: set_fact 28011 1726882542.05805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882542.05836: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882542.05872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.05912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882542.05960: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882542.06298: variable 'network_connections' from source: task vars 28011 1726882542.06303: variable 'interface' from source: set_fact 28011 1726882542.06358: variable 'interface' from source: set_fact 28011 1726882542.06373: variable 'interface' from source: set_fact 28011 1726882542.06482: variable 'interface' from source: set_fact 28011 1726882542.06702: variable '__network_packages_default_wireless' from source: role '' defaults 28011 1726882542.06705: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882542.06924: variable 'network_connections' from source: task vars 28011 1726882542.06933: variable 'interface' from source: set_fact 28011 1726882542.06994: variable 'interface' from source: set_fact 28011 1726882542.07006: variable 'interface' from source: set_fact 28011 1726882542.07071: variable 'interface' from source: set_fact 28011 1726882542.07105: variable '__network_packages_default_team' from source: role '' defaults 28011 1726882542.07183: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882542.07464: variable 'network_connections' from source: task vars 28011 1726882542.07474: variable 'interface' from source: set_fact 28011 1726882542.07535: variable 'interface' from source: set_fact 28011 1726882542.07546: variable 'interface' from source: set_fact 28011 1726882542.07615: variable 'interface' from source: set_fact 28011 1726882542.07685: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 
1726882542.07743: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 1726882542.07754: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882542.07820: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882542.08037: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28011 1726882542.08541: variable 'network_connections' from source: task vars 28011 1726882542.08558: variable 'interface' from source: set_fact 28011 1726882542.08625: variable 'interface' from source: set_fact 28011 1726882542.08637: variable 'interface' from source: set_fact 28011 1726882542.08709: variable 'interface' from source: set_fact 28011 1726882542.08729: variable 'ansible_distribution' from source: facts 28011 1726882542.08737: variable '__network_rh_distros' from source: role '' defaults 28011 1726882542.08746: variable 'ansible_distribution_major_version' from source: facts 28011 1726882542.08780: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28011 1726882542.08952: variable 'ansible_distribution' from source: facts 28011 1726882542.08964: variable '__network_rh_distros' from source: role '' defaults 28011 1726882542.08976: variable 'ansible_distribution_major_version' from source: facts 28011 1726882542.09006: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28011 1726882542.09170: variable 'ansible_distribution' from source: facts 28011 1726882542.09209: variable '__network_rh_distros' from source: role '' defaults 28011 1726882542.09212: variable 'ansible_distribution_major_version' from source: facts 28011 1726882542.09239: variable 'network_provider' from source: set_fact 28011 1726882542.09262: variable 'ansible_facts' from source: unknown 28011 1726882542.10296: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 28011 1726882542.10299: when evaluation is False, skipping this task 28011 1726882542.10301: _execute() done 28011 1726882542.10303: dumping result to json 28011 1726882542.10305: done dumping result, returning 28011 1726882542.10308: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-962d-7c65-00000000001e] 28011 1726882542.10310: sending task result for task 12673a56-9f93-962d-7c65-00000000001e 28011 1726882542.10378: done sending task result for task 12673a56-9f93-962d-7c65-00000000001e 28011 1726882542.10381: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28011 1726882542.10431: no more pending results, returning what we have 28011 1726882542.10435: results queue empty 28011 1726882542.10436: checking for any_errors_fatal 28011 1726882542.10444: done checking for any_errors_fatal 28011 1726882542.10444: checking for max_fail_percentage 28011 1726882542.10446: done checking for max_fail_percentage 28011 1726882542.10447: checking to see if all hosts have failed and the running result is not ok 28011 1726882542.10448: done checking to see if all hosts have failed 28011 1726882542.10448: getting the remaining hosts for this loop 28011 1726882542.10450: done getting the remaining hosts for this loop 28011 1726882542.10453: getting the next task for host managed_node1 28011 1726882542.10460: done getting next task for host managed_node1 28011 1726882542.10464: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28011 1726882542.10466: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882542.10488: getting variables 28011 1726882542.10489: in VariableManager get_vars() 28011 1726882542.10733: Calling all_inventory to load vars for managed_node1 28011 1726882542.10736: Calling groups_inventory to load vars for managed_node1 28011 1726882542.10738: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882542.10748: Calling all_plugins_play to load vars for managed_node1 28011 1726882542.10750: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882542.10753: Calling groups_plugins_play to load vars for managed_node1 28011 1726882542.12153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882542.13969: done with get_vars() 28011 1726882542.13991: done getting variables 28011 1726882542.14056: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:35:42 -0400 (0:00:00.153) 0:00:11.692 ****** 28011 1726882542.14096: entering _queue_task() for managed_node1/package 28011 1726882542.14624: worker is 1 (out of 1 available) 28011 1726882542.14633: exiting 
_queue_task() for managed_node1/package 28011 1726882542.14643: done queuing things up, now waiting for results queue to drain 28011 1726882542.14644: waiting for pending results... 28011 1726882542.14885: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28011 1726882542.14889: in run() - task 12673a56-9f93-962d-7c65-00000000001f 28011 1726882542.14892: variable 'ansible_search_path' from source: unknown 28011 1726882542.14897: variable 'ansible_search_path' from source: unknown 28011 1726882542.14933: calling self._execute() 28011 1726882542.15024: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882542.15034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882542.15046: variable 'omit' from source: magic vars 28011 1726882542.15436: variable 'ansible_distribution_major_version' from source: facts 28011 1726882542.15453: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882542.15581: variable 'network_state' from source: role '' defaults 28011 1726882542.15601: Evaluated conditional (network_state != {}): False 28011 1726882542.15609: when evaluation is False, skipping this task 28011 1726882542.15622: _execute() done 28011 1726882542.15635: dumping result to json 28011 1726882542.15697: done dumping result, returning 28011 1726882542.15701: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-962d-7c65-00000000001f] 28011 1726882542.15704: sending task result for task 12673a56-9f93-962d-7c65-00000000001f skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882542.15890: no more pending results, returning what we have 28011 1726882542.15896: results queue 
empty 28011 1726882542.15898: checking for any_errors_fatal 28011 1726882542.15908: done checking for any_errors_fatal 28011 1726882542.15909: checking for max_fail_percentage 28011 1726882542.15911: done checking for max_fail_percentage 28011 1726882542.15912: checking to see if all hosts have failed and the running result is not ok 28011 1726882542.15913: done checking to see if all hosts have failed 28011 1726882542.15914: getting the remaining hosts for this loop 28011 1726882542.15916: done getting the remaining hosts for this loop 28011 1726882542.15920: getting the next task for host managed_node1 28011 1726882542.15928: done getting next task for host managed_node1 28011 1726882542.15931: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28011 1726882542.15934: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882542.15957: getting variables 28011 1726882542.15959: in VariableManager get_vars() 28011 1726882542.16182: Calling all_inventory to load vars for managed_node1 28011 1726882542.16185: Calling groups_inventory to load vars for managed_node1 28011 1726882542.16188: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882542.16204: done sending task result for task 12673a56-9f93-962d-7c65-00000000001f 28011 1726882542.16207: WORKER PROCESS EXITING 28011 1726882542.16221: Calling all_plugins_play to load vars for managed_node1 28011 1726882542.16225: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882542.16231: Calling groups_plugins_play to load vars for managed_node1 28011 1726882542.17891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882542.19570: done with get_vars() 28011 1726882542.19595: done getting variables 28011 1726882542.19679: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:35:42 -0400 (0:00:00.056) 0:00:11.748 ****** 28011 1726882542.19722: entering _queue_task() for managed_node1/package 28011 1726882542.20362: worker is 1 (out of 1 available) 28011 1726882542.20374: exiting _queue_task() for managed_node1/package 28011 1726882542.20386: done queuing things up, now waiting for results queue to drain 28011 1726882542.20387: waiting for pending results... 
28011 1726882542.21290: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28011 1726882542.21419: in run() - task 12673a56-9f93-962d-7c65-000000000020 28011 1726882542.21433: variable 'ansible_search_path' from source: unknown 28011 1726882542.21437: variable 'ansible_search_path' from source: unknown 28011 1726882542.21589: calling self._execute() 28011 1726882542.21812: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882542.21817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882542.21828: variable 'omit' from source: magic vars 28011 1726882542.22568: variable 'ansible_distribution_major_version' from source: facts 28011 1726882542.22588: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882542.22725: variable 'network_state' from source: role '' defaults 28011 1726882542.22742: Evaluated conditional (network_state != {}): False 28011 1726882542.22751: when evaluation is False, skipping this task 28011 1726882542.22760: _execute() done 28011 1726882542.22769: dumping result to json 28011 1726882542.22778: done dumping result, returning 28011 1726882542.22792: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-962d-7c65-000000000020] 28011 1726882542.22809: sending task result for task 12673a56-9f93-962d-7c65-000000000020 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882542.22958: no more pending results, returning what we have 28011 1726882542.22962: results queue empty 28011 1726882542.22962: checking for any_errors_fatal 28011 1726882542.22971: done checking for any_errors_fatal 28011 1726882542.22972: checking for max_fail_percentage 28011 
1726882542.22973: done checking for max_fail_percentage 28011 1726882542.22974: checking to see if all hosts have failed and the running result is not ok 28011 1726882542.22975: done checking to see if all hosts have failed 28011 1726882542.22975: getting the remaining hosts for this loop 28011 1726882542.22977: done getting the remaining hosts for this loop 28011 1726882542.22980: getting the next task for host managed_node1 28011 1726882542.22998: done getting next task for host managed_node1 28011 1726882542.23002: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28011 1726882542.23005: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882542.23019: getting variables 28011 1726882542.23021: in VariableManager get_vars() 28011 1726882542.23057: Calling all_inventory to load vars for managed_node1 28011 1726882542.23059: Calling groups_inventory to load vars for managed_node1 28011 1726882542.23061: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882542.23070: Calling all_plugins_play to load vars for managed_node1 28011 1726882542.23073: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882542.23075: Calling groups_plugins_play to load vars for managed_node1 28011 1726882542.23609: done sending task result for task 12673a56-9f93-962d-7c65-000000000020 28011 1726882542.23612: WORKER PROCESS EXITING 28011 1726882542.25929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882542.29717: done with get_vars() 28011 1726882542.29749: done getting variables 28011 1726882542.29968: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:35:42 -0400 (0:00:00.104) 0:00:11.853 ****** 28011 1726882542.30205: entering _queue_task() for managed_node1/service 28011 1726882542.30207: Creating lock for service 28011 1726882542.30787: worker is 1 (out of 1 available) 28011 1726882542.30811: exiting _queue_task() for managed_node1/service 28011 1726882542.30823: done queuing things up, now waiting for results queue to drain 28011 1726882542.30825: waiting for pending results... 
28011 1726882542.31097: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28011 1726882542.31237: in run() - task 12673a56-9f93-962d-7c65-000000000021 28011 1726882542.31258: variable 'ansible_search_path' from source: unknown 28011 1726882542.31267: variable 'ansible_search_path' from source: unknown 28011 1726882542.31319: calling self._execute() 28011 1726882542.31415: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882542.31425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882542.31438: variable 'omit' from source: magic vars 28011 1726882542.31833: variable 'ansible_distribution_major_version' from source: facts 28011 1726882542.31849: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882542.31975: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882542.32185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882542.34948: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882542.35030: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882542.35066: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882542.35102: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882542.35136: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882542.35213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28011 1726882542.35247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.35272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.35315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.35328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882542.35377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882542.35403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.35426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.35467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.35480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882542.35522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882542.35544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.35571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.35612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.35627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882542.35808: variable 'network_connections' from source: task vars 28011 1726882542.35823: variable 'interface' from source: set_fact 28011 1726882542.35899: variable 'interface' from source: set_fact 28011 1726882542.35931: variable 'interface' from source: set_fact 28011 1726882542.35967: variable 'interface' from source: set_fact 28011 1726882542.36050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882542.36257: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882542.36260: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882542.36295: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882542.36329: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882542.36368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882542.36475: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882542.36478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.36480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882542.36510: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882542.36753: variable 'network_connections' from source: task vars 28011 1726882542.36765: variable 'interface' from source: set_fact 28011 1726882542.36830: variable 'interface' from source: set_fact 28011 1726882542.36837: variable 'interface' from source: set_fact 28011 1726882542.36907: variable 'interface' from source: set_fact 28011 1726882542.36939: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28011 1726882542.36943: when evaluation is False, skipping this task 28011 1726882542.36945: _execute() done 28011 1726882542.36948: dumping result to json 28011 1726882542.36950: done dumping result, returning 28011 1726882542.36959: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12673a56-9f93-962d-7c65-000000000021] 28011 1726882542.36969: sending task result for task 12673a56-9f93-962d-7c65-000000000021 28011 1726882542.37248: done sending task result for task 12673a56-9f93-962d-7c65-000000000021 28011 1726882542.37250: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28011 1726882542.37284: no more pending results, returning what we have 28011 1726882542.37287: results queue empty 28011 1726882542.37288: checking for any_errors_fatal 28011 1726882542.37292: done checking for any_errors_fatal 28011 1726882542.37295: checking for max_fail_percentage 28011 1726882542.37296: done checking for max_fail_percentage 28011 1726882542.37297: checking to see if all hosts have failed and the running result is not ok 28011 1726882542.37298: done checking to see if all hosts have failed 28011 1726882542.37298: getting the remaining hosts for this loop 28011 1726882542.37299: done getting the remaining hosts for this loop 28011 1726882542.37303: getting the next task for host managed_node1 28011 1726882542.37308: done getting next task for host managed_node1 28011 1726882542.37312: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28011 1726882542.37314: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28011 1726882542.37328: getting variables 28011 1726882542.37330: in VariableManager get_vars() 28011 1726882542.37367: Calling all_inventory to load vars for managed_node1 28011 1726882542.37370: Calling groups_inventory to load vars for managed_node1 28011 1726882542.37372: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882542.37381: Calling all_plugins_play to load vars for managed_node1 28011 1726882542.37384: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882542.37386: Calling groups_plugins_play to load vars for managed_node1 28011 1726882542.38698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882542.40188: done with get_vars() 28011 1726882542.40213: done getting variables 28011 1726882542.40264: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:35:42 -0400 (0:00:00.100) 0:00:11.954 ****** 28011 1726882542.40294: entering _queue_task() for managed_node1/service 28011 1726882542.40591: worker is 1 (out of 1 available) 28011 1726882542.40607: exiting _queue_task() for managed_node1/service 28011 1726882542.40619: done queuing things up, now waiting for results queue to drain 28011 1726882542.40620: waiting for pending results... 
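The skip logged above ("Restart NetworkManager due to wireless or team interfaces") shows how a `when:` conditional that evaluates to False turns into a skip result rather than an execution. A minimal sketch of that result shape, using the exact fields visible in the log (the function name and structure are illustrative, not Ansible's internals):

```python
# Hypothetical sketch: building the skip result dict that the log shows
# being sent back for a task whose conditional evaluated to False.
def skip_result(false_condition: str) -> dict:
    """Result emitted for a task skipped because `when:` was False."""
    return {
        "changed": False,
        "false_condition": false_condition,
        "skip_reason": "Conditional result was False",
    }

result = skip_result(
    "__network_wireless_connections_defined or __network_team_connections_defined"
)
```

The three keys match the `skipping: [managed_node1] => {...}` block in the log above; everything else here is a sketch.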
28011 1726882542.41011: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28011 1726882542.41016: in run() - task 12673a56-9f93-962d-7c65-000000000022 28011 1726882542.41031: variable 'ansible_search_path' from source: unknown 28011 1726882542.41038: variable 'ansible_search_path' from source: unknown 28011 1726882542.41077: calling self._execute() 28011 1726882542.41173: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882542.41185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882542.41203: variable 'omit' from source: magic vars 28011 1726882542.41583: variable 'ansible_distribution_major_version' from source: facts 28011 1726882542.41603: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882542.41769: variable 'network_provider' from source: set_fact 28011 1726882542.41779: variable 'network_state' from source: role '' defaults 28011 1726882542.41792: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28011 1726882542.41805: variable 'omit' from source: magic vars 28011 1726882542.41858: variable 'omit' from source: magic vars 28011 1726882542.41895: variable 'network_service_name' from source: role '' defaults 28011 1726882542.41971: variable 'network_service_name' from source: role '' defaults 28011 1726882542.42087: variable '__network_provider_setup' from source: role '' defaults 28011 1726882542.42100: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882542.42163: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882542.42177: variable '__network_packages_default_nm' from source: role '' defaults 28011 1726882542.42245: variable '__network_packages_default_nm' from source: role '' defaults 28011 1726882542.42469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 28011 1726882542.44900: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882542.44903: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882542.44950: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882542.44989: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882542.45027: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882542.45109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882542.45200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.45203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.45231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.45251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882542.45304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28011 1726882542.45327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.45356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.45391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.45411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882542.45622: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28011 1726882542.45771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882542.45774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.45798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.45841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.45862: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882542.45958: variable 'ansible_python' from source: facts 28011 1726882542.45988: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28011 1726882542.46098: variable '__network_wpa_supplicant_required' from source: role '' defaults 28011 1726882542.46155: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28011 1726882542.46283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882542.46318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.46598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.46601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.46603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882542.46605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882542.46615: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882542.46618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.46620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882542.46622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882542.46691: variable 'network_connections' from source: task vars 28011 1726882542.46708: variable 'interface' from source: set_fact 28011 1726882542.46798: variable 'interface' from source: set_fact 28011 1726882542.46818: variable 'interface' from source: set_fact 28011 1726882542.46903: variable 'interface' from source: set_fact 28011 1726882542.47040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882542.47238: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882542.47295: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882542.47339: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882542.47384: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882542.47452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882542.47484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882542.47524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882542.47559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882542.47618: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882542.47895: variable 'network_connections' from source: task vars 28011 1726882542.47907: variable 'interface' from source: set_fact 28011 1726882542.47984: variable 'interface' from source: set_fact 28011 1726882542.48002: variable 'interface' from source: set_fact 28011 1726882542.48079: variable 'interface' from source: set_fact 28011 1726882542.48164: variable '__network_packages_default_wireless' from source: role '' defaults 28011 1726882542.48245: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882542.48542: variable 'network_connections' from source: task vars 28011 1726882542.48552: variable 'interface' from source: set_fact 28011 1726882542.48630: variable 'interface' from source: set_fact 28011 1726882542.48689: variable 'interface' from source: set_fact 28011 1726882542.48720: variable 'interface' from source: set_fact 28011 1726882542.48753: variable '__network_packages_default_team' from source: role '' defaults 28011 1726882542.48837: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882542.49103: variable 
'network_connections' from source: task vars 28011 1726882542.49114: variable 'interface' from source: set_fact 28011 1726882542.49191: variable 'interface' from source: set_fact 28011 1726882542.49207: variable 'interface' from source: set_fact 28011 1726882542.49343: variable 'interface' from source: set_fact 28011 1726882542.49363: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 1726882542.49428: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 1726882542.49440: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882542.49506: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882542.49726: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28011 1726882542.50245: variable 'network_connections' from source: task vars 28011 1726882542.50256: variable 'interface' from source: set_fact 28011 1726882542.50323: variable 'interface' from source: set_fact 28011 1726882542.50334: variable 'interface' from source: set_fact 28011 1726882542.50398: variable 'interface' from source: set_fact 28011 1726882542.50498: variable 'ansible_distribution' from source: facts 28011 1726882542.50501: variable '__network_rh_distros' from source: role '' defaults 28011 1726882542.50502: variable 'ansible_distribution_major_version' from source: facts 28011 1726882542.50504: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28011 1726882542.50624: variable 'ansible_distribution' from source: facts 28011 1726882542.50633: variable '__network_rh_distros' from source: role '' defaults 28011 1726882542.50649: variable 'ansible_distribution_major_version' from source: facts 28011 1726882542.50667: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28011 1726882542.50828: variable 'ansible_distribution' from source: 
facts 28011 1726882542.50838: variable '__network_rh_distros' from source: role '' defaults 28011 1726882542.50847: variable 'ansible_distribution_major_version' from source: facts 28011 1726882542.50890: variable 'network_provider' from source: set_fact 28011 1726882542.50918: variable 'omit' from source: magic vars 28011 1726882542.50948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882542.50984: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882542.51011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882542.51032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882542.51077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882542.51081: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882542.51089: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882542.51099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882542.51200: Set connection var ansible_connection to ssh 28011 1726882542.51213: Set connection var ansible_pipelining to False 28011 1726882542.51295: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882542.51299: Set connection var ansible_shell_executable to /bin/sh 28011 1726882542.51301: Set connection var ansible_timeout to 10 28011 1726882542.51303: Set connection var ansible_shell_type to sh 28011 1726882542.51305: variable 'ansible_shell_executable' from source: unknown 28011 1726882542.51307: variable 'ansible_connection' from source: unknown 28011 1726882542.51309: variable 'ansible_module_compression' from source: unknown 28011 1726882542.51311: 
variable 'ansible_shell_type' from source: unknown 28011 1726882542.51312: variable 'ansible_shell_executable' from source: unknown 28011 1726882542.51314: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882542.51320: variable 'ansible_pipelining' from source: unknown 28011 1726882542.51322: variable 'ansible_timeout' from source: unknown 28011 1726882542.51324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882542.51423: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882542.51524: variable 'omit' from source: magic vars 28011 1726882542.51534: starting attempt loop 28011 1726882542.51540: running the handler 28011 1726882542.51614: variable 'ansible_facts' from source: unknown 28011 1726882542.52836: _low_level_execute_command(): starting 28011 1726882542.52964: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882542.53727: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882542.53813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882542.53821: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882542.53888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882542.55870: stdout chunk (state=3): >>>/root <<< 28011 1726882542.55874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882542.55876: stdout chunk (state=3): >>><<< 28011 1726882542.55878: stderr chunk (state=3): >>><<< 28011 1726882542.55880: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 28011 1726882542.55882: _low_level_execute_command(): starting 28011 1726882542.55885: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712 `" && echo ansible-tmp-1726882542.5578883-28612-267912319683712="` echo /root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712 `" ) && sleep 0' 28011 1726882542.57013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882542.57111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882542.57131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882542.57200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882542.59067: stdout chunk (state=3): 
>>>ansible-tmp-1726882542.5578883-28612-267912319683712=/root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712 <<< 28011 1726882542.59178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882542.59300: stderr chunk (state=3): >>><<< 28011 1726882542.59310: stdout chunk (state=3): >>><<< 28011 1726882542.59332: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882542.5578883-28612-267912319683712=/root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882542.59380: variable 'ansible_module_compression' from source: unknown 28011 1726882542.59699: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 28011 1726882542.59703: ANSIBALLZ: Acquiring lock 28011 1726882542.59705: ANSIBALLZ: Lock acquired: 139767565767152 28011 
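The remote temp directory created above follows a recognizable pattern: `ansible-tmp-<epoch>-<pid>-<random>` (here `ansible-tmp-1726882542.5578883-28612-267912319683712`). A sketch of that naming scheme, inferred from the logged path rather than taken from Ansible source:

```python
# Assumed naming scheme for the per-task remote temp directory, inferred
# from the path visible in the log: ansible-tmp-<epoch>-<pid>-<random>.
import os
import random
import time

def tmp_dir_name() -> str:
    """Generate a temp-dir name in the ansible-tmp-<epoch>-<pid>-<random> style."""
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))

name = tmp_dir_name()
```

The combination of timestamp, PID, and a large random suffix makes collisions between concurrent runs on the same host effectively impossible, which matters because many forks may create tmp dirs under the same `~/.ansible/tmp` at once.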
1726882542.59706: ANSIBALLZ: Creating module 28011 1726882542.92388: ANSIBALLZ: Writing module into payload 28011 1726882542.92556: ANSIBALLZ: Writing module 28011 1726882542.92583: ANSIBALLZ: Renaming module 28011 1726882542.92599: ANSIBALLZ: Done creating module 28011 1726882542.92642: variable 'ansible_facts' from source: unknown 28011 1726882542.92870: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712/AnsiballZ_systemd.py 28011 1726882542.93014: Sending initial data 28011 1726882542.93022: Sent initial data (156 bytes) 28011 1726882542.93716: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882542.93738: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882542.93783: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882542.93865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882542.93875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882542.93905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 
1726882542.94086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882542.95622: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882542.95687: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882542.95738: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpunm12jwr /root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712/AnsiballZ_systemd.py <<< 28011 1726882542.95745: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712/AnsiballZ_systemd.py" <<< 28011 1726882542.95779: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpunm12jwr" to remote "/root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712/AnsiballZ_systemd.py" <<< 28011 1726882542.97320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882542.97426: stderr chunk (state=3): >>><<< 28011 1726882542.97429: stdout chunk (state=3): >>><<< 28011 1726882542.97431: done transferring module to remote 28011 1726882542.97433: _low_level_execute_command(): starting 28011 1726882542.97436: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712/ /root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712/AnsiballZ_systemd.py && sleep 0' 28011 1726882542.98187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882542.98251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882542.98282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882542.98306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882542.98397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882543.00185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882543.00188: stdout chunk (state=3): >>><<< 28011 1726882543.00195: stderr chunk (state=3): >>><<< 28011 1726882543.00295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882543.00299: _low_level_execute_command(): starting 28011 1726882543.00303: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712/AnsiballZ_systemd.py && sleep 0' 28011 1726882543.00942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882543.00976: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882543.01003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882543.01024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882543.01049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882543.01076: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882543.01186: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882543.01215: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882543.01316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882543.30137: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10833920", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3302117376", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1589966000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", 
"MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": 
"Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28011 1726882543.32155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882543.32159: stdout chunk (state=3): >>><<< 28011 1726882543.32161: stderr chunk (state=3): >>><<< 28011 1726882543.32164: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", 
"ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10833920", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3302117376", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1589966000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", 
"InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882543.32545: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882543.32573: _low_level_execute_command(): starting 28011 1726882543.32778: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882542.5578883-28612-267912319683712/ > /dev/null 2>&1 && sleep 0' 28011 1726882543.33909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882543.33923: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882543.34000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882543.34006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882543.34221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882543.36005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882543.36051: stderr chunk (state=3): >>><<< 28011 1726882543.36054: stdout chunk (state=3): >>><<< 28011 1726882543.36075: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882543.36082: handler run complete 28011 1726882543.36157: attempt loop complete, returning result 28011 1726882543.36160: _execute() done 28011 1726882543.36163: dumping result to json 28011 1726882543.36182: done dumping result, returning 28011 1726882543.36196: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-962d-7c65-000000000022] 28011 1726882543.36199: sending task result for task 12673a56-9f93-962d-7c65-000000000022 28011 1726882543.36639: done sending task result for task 12673a56-9f93-962d-7c65-000000000022 28011 1726882543.36643: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882543.36714: no more pending results, returning what we have 28011 1726882543.36717: results queue empty 28011 1726882543.36718: checking for any_errors_fatal 28011 1726882543.36725: done checking for any_errors_fatal 28011 1726882543.36726: checking for max_fail_percentage 28011 1726882543.36728: done checking for max_fail_percentage 28011 1726882543.36728: checking to see if all hosts have failed and the running result is not ok 28011 1726882543.36729: done checking to see if all hosts have failed 28011 1726882543.36730: getting the remaining hosts for this loop 28011 
1726882543.36731: done getting the remaining hosts for this loop 28011 1726882543.36735: getting the next task for host managed_node1 28011 1726882543.36741: done getting next task for host managed_node1 28011 1726882543.36745: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28011 1726882543.36747: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882543.36758: getting variables 28011 1726882543.36760: in VariableManager get_vars() 28011 1726882543.36802: Calling all_inventory to load vars for managed_node1 28011 1726882543.36805: Calling groups_inventory to load vars for managed_node1 28011 1726882543.36807: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882543.36818: Calling all_plugins_play to load vars for managed_node1 28011 1726882543.36821: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882543.36824: Calling groups_plugins_play to load vars for managed_node1 28011 1726882543.38539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882543.39934: done with get_vars() 28011 1726882543.39959: done getting variables 28011 1726882543.40012: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:35:43 -0400 (0:00:00.997) 0:00:12.951 ****** 28011 1726882543.40035: entering _queue_task() for managed_node1/service 28011 1726882543.40331: worker is 1 (out of 1 available) 28011 1726882543.40343: exiting _queue_task() for managed_node1/service 28011 1726882543.40355: done queuing things up, now waiting for results queue to drain 28011 1726882543.40357: waiting for pending results... 28011 1726882543.40721: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28011 1726882543.40761: in run() - task 12673a56-9f93-962d-7c65-000000000023 28011 1726882543.40818: variable 'ansible_search_path' from source: unknown 28011 1726882543.40822: variable 'ansible_search_path' from source: unknown 28011 1726882543.40830: calling self._execute() 28011 1726882543.40903: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882543.40909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882543.40924: variable 'omit' from source: magic vars 28011 1726882543.41311: variable 'ansible_distribution_major_version' from source: facts 28011 1726882543.41315: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882543.41406: variable 'network_provider' from source: set_fact 28011 1726882543.41409: Evaluated conditional (network_provider == "nm"): True 28011 1726882543.41478: variable '__network_wpa_supplicant_required' from source: role '' defaults 28011 1726882543.41542: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 28011 1726882543.41655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882543.44782: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882543.44798: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882543.44843: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882543.44930: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882543.45028: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882543.45238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882543.45272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882543.45352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882543.45501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882543.45505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882543.45760: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882543.45763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882543.45765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882543.45768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882543.45770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882543.45907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882543.45943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882543.46051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882543.46097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 
1726882543.46150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882543.46469: variable 'network_connections' from source: task vars 28011 1726882543.46487: variable 'interface' from source: set_fact 28011 1726882543.46664: variable 'interface' from source: set_fact 28011 1726882543.46679: variable 'interface' from source: set_fact 28011 1726882543.46952: variable 'interface' from source: set_fact 28011 1726882543.46955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882543.47341: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882543.47429: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882543.47530: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882543.47563: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882543.47648: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882543.47738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882543.47766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882543.47850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882543.47903: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882543.48173: variable 'network_connections' from source: task vars 28011 1726882543.48183: variable 'interface' from source: set_fact 28011 1726882543.48249: variable 'interface' from source: set_fact 28011 1726882543.48266: variable 'interface' from source: set_fact 28011 1726882543.48329: variable 'interface' from source: set_fact 28011 1726882543.48381: Evaluated conditional (__network_wpa_supplicant_required): False 28011 1726882543.48388: when evaluation is False, skipping this task 28011 1726882543.48400: _execute() done 28011 1726882543.48418: dumping result to json 28011 1726882543.48426: done dumping result, returning 28011 1726882543.48438: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-962d-7c65-000000000023] 28011 1726882543.48474: sending task result for task 12673a56-9f93-962d-7c65-000000000023 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28011 1726882543.48594: no more pending results, returning what we have 28011 1726882543.48598: results queue empty 28011 1726882543.48599: checking for any_errors_fatal 28011 1726882543.48619: done checking for any_errors_fatal 28011 1726882543.48619: checking for max_fail_percentage 28011 1726882543.48621: done checking for max_fail_percentage 28011 1726882543.48622: checking to see if all hosts have failed and the running result is not ok 28011 1726882543.48623: done checking to see if all hosts have failed 28011 1726882543.48623: getting the remaining hosts for this loop 28011 1726882543.48625: done getting the remaining hosts for this loop 28011 1726882543.48628: getting the next task for host 
managed_node1 28011 1726882543.48634: done getting next task for host managed_node1 28011 1726882543.48638: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28011 1726882543.48641: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882543.48654: getting variables 28011 1726882543.48656: in VariableManager get_vars() 28011 1726882543.48698: Calling all_inventory to load vars for managed_node1 28011 1726882543.48701: Calling groups_inventory to load vars for managed_node1 28011 1726882543.48703: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882543.48714: Calling all_plugins_play to load vars for managed_node1 28011 1726882543.48717: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882543.48720: Calling groups_plugins_play to load vars for managed_node1 28011 1726882543.50100: done sending task result for task 12673a56-9f93-962d-7c65-000000000023 28011 1726882543.50106: WORKER PROCESS EXITING 28011 1726882543.51423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882543.54936: done with get_vars() 28011 1726882543.54958: done getting variables 28011 1726882543.55018: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:35:43 -0400 (0:00:00.150) 0:00:13.101 ****** 28011 1726882543.55048: entering _queue_task() for managed_node1/service 28011 1726882543.55689: worker is 1 (out of 1 available) 28011 1726882543.55704: exiting _queue_task() for managed_node1/service 28011 1726882543.55717: done queuing things up, now waiting for results queue to drain 28011 1726882543.55719: waiting for pending results... 28011 1726882543.56313: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 28011 1726882543.56794: in run() - task 12673a56-9f93-962d-7c65-000000000024 28011 1726882543.56798: variable 'ansible_search_path' from source: unknown 28011 1726882543.56801: variable 'ansible_search_path' from source: unknown 28011 1726882543.56805: calling self._execute() 28011 1726882543.56862: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882543.57116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882543.57120: variable 'omit' from source: magic vars 28011 1726882543.57796: variable 'ansible_distribution_major_version' from source: facts 28011 1726882543.57816: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882543.58109: variable 'network_provider' from source: set_fact 28011 1726882543.58122: Evaluated conditional (network_provider == "initscripts"): False 28011 1726882543.58130: when evaluation is False, skipping this task 28011 1726882543.58138: _execute() done 28011 1726882543.58147: dumping result to json 28011 1726882543.58155: done dumping result, 
returning 28011 1726882543.58169: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-962d-7c65-000000000024] 28011 1726882543.58180: sending task result for task 12673a56-9f93-962d-7c65-000000000024 28011 1726882543.58500: done sending task result for task 12673a56-9f93-962d-7c65-000000000024 28011 1726882543.58504: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882543.58550: no more pending results, returning what we have 28011 1726882543.58555: results queue empty 28011 1726882543.58556: checking for any_errors_fatal 28011 1726882543.58566: done checking for any_errors_fatal 28011 1726882543.58566: checking for max_fail_percentage 28011 1726882543.58568: done checking for max_fail_percentage 28011 1726882543.58569: checking to see if all hosts have failed and the running result is not ok 28011 1726882543.58570: done checking to see if all hosts have failed 28011 1726882543.58570: getting the remaining hosts for this loop 28011 1726882543.58572: done getting the remaining hosts for this loop 28011 1726882543.58576: getting the next task for host managed_node1 28011 1726882543.58582: done getting next task for host managed_node1 28011 1726882543.58586: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28011 1726882543.58589: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882543.58607: getting variables 28011 1726882543.58609: in VariableManager get_vars() 28011 1726882543.58652: Calling all_inventory to load vars for managed_node1 28011 1726882543.58655: Calling groups_inventory to load vars for managed_node1 28011 1726882543.58657: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882543.58669: Calling all_plugins_play to load vars for managed_node1 28011 1726882543.58672: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882543.58675: Calling groups_plugins_play to load vars for managed_node1 28011 1726882543.61210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882543.64147: done with get_vars() 28011 1726882543.64181: done getting variables 28011 1726882543.64446: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:35:43 -0400 (0:00:00.094) 0:00:13.196 ****** 28011 1726882543.64482: entering _queue_task() for managed_node1/copy 28011 1726882543.65232: worker is 1 (out of 1 available) 28011 1726882543.65241: exiting _queue_task() for managed_node1/copy 28011 1726882543.65252: done queuing things up, now waiting for results queue to drain 28011 1726882543.65253: waiting for pending results... 
28011 1726882543.65612: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28011 1726882543.65928: in run() - task 12673a56-9f93-962d-7c65-000000000025 28011 1726882543.65954: variable 'ansible_search_path' from source: unknown 28011 1726882543.65963: variable 'ansible_search_path' from source: unknown 28011 1726882543.66011: calling self._execute() 28011 1726882543.66389: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882543.66397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882543.66400: variable 'omit' from source: magic vars 28011 1726882543.67128: variable 'ansible_distribution_major_version' from source: facts 28011 1726882543.67147: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882543.67387: variable 'network_provider' from source: set_fact 28011 1726882543.67405: Evaluated conditional (network_provider == "initscripts"): False 28011 1726882543.67413: when evaluation is False, skipping this task 28011 1726882543.67429: _execute() done 28011 1726882543.67535: dumping result to json 28011 1726882543.67538: done dumping result, returning 28011 1726882543.67542: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-962d-7c65-000000000025] 28011 1726882543.67545: sending task result for task 12673a56-9f93-962d-7c65-000000000025 28011 1726882543.67709: done sending task result for task 12673a56-9f93-962d-7c65-000000000025 28011 1726882543.67713: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28011 1726882543.67766: no more pending results, returning what we have 28011 1726882543.67770: results queue empty 28011 1726882543.67771: checking for 
any_errors_fatal 28011 1726882543.67779: done checking for any_errors_fatal 28011 1726882543.67779: checking for max_fail_percentage 28011 1726882543.67781: done checking for max_fail_percentage 28011 1726882543.67782: checking to see if all hosts have failed and the running result is not ok 28011 1726882543.67783: done checking to see if all hosts have failed 28011 1726882543.67783: getting the remaining hosts for this loop 28011 1726882543.67785: done getting the remaining hosts for this loop 28011 1726882543.67789: getting the next task for host managed_node1 28011 1726882543.67797: done getting next task for host managed_node1 28011 1726882543.67801: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28011 1726882543.67804: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882543.67821: getting variables 28011 1726882543.67823: in VariableManager get_vars() 28011 1726882543.67868: Calling all_inventory to load vars for managed_node1 28011 1726882543.67870: Calling groups_inventory to load vars for managed_node1 28011 1726882543.67872: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882543.67883: Calling all_plugins_play to load vars for managed_node1 28011 1726882543.67885: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882543.67888: Calling groups_plugins_play to load vars for managed_node1 28011 1726882543.70941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882543.73513: done with get_vars() 28011 1726882543.73541: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:35:43 -0400 (0:00:00.094) 0:00:13.290 ****** 28011 1726882543.73891: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 28011 1726882543.73896: Creating lock for fedora.linux_system_roles.network_connections 28011 1726882543.74425: worker is 1 (out of 1 available) 28011 1726882543.74439: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 28011 1726882543.74450: done queuing things up, now waiting for results queue to drain 28011 1726882543.74452: waiting for pending results... 
28011 1726882543.75110: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28011 1726882543.75252: in run() - task 12673a56-9f93-962d-7c65-000000000026 28011 1726882543.75453: variable 'ansible_search_path' from source: unknown 28011 1726882543.75457: variable 'ansible_search_path' from source: unknown 28011 1726882543.75460: calling self._execute() 28011 1726882543.75609: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882543.75680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882543.75702: variable 'omit' from source: magic vars 28011 1726882543.76402: variable 'ansible_distribution_major_version' from source: facts 28011 1726882543.76537: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882543.76553: variable 'omit' from source: magic vars 28011 1726882543.76617: variable 'omit' from source: magic vars 28011 1726882543.76786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882543.79934: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882543.80014: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882543.80060: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882543.80169: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882543.80172: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882543.80223: variable 'network_provider' from source: set_fact 28011 1726882543.80398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882543.80448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882543.80479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882543.80535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882543.80800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882543.80804: variable 'omit' from source: magic vars 28011 1726882543.81026: variable 'omit' from source: magic vars 28011 1726882543.81162: variable 'network_connections' from source: task vars 28011 1726882543.81179: variable 'interface' from source: set_fact 28011 1726882543.81256: variable 'interface' from source: set_fact 28011 1726882543.81270: variable 'interface' from source: set_fact 28011 1726882543.81335: variable 'interface' from source: set_fact 28011 1726882543.81553: variable 'omit' from source: magic vars 28011 1726882543.81572: variable '__lsr_ansible_managed' from source: task vars 28011 1726882543.81642: variable '__lsr_ansible_managed' from source: task vars 28011 1726882543.82041: Loaded config def from plugin (lookup/template) 28011 1726882543.82215: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28011 1726882543.82219: File lookup term: get_ansible_managed.j2 28011 
1726882543.82221: variable 'ansible_search_path' from source: unknown 28011 1726882543.82224: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28011 1726882543.82227: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28011 1726882543.82231: variable 'ansible_search_path' from source: unknown 28011 1726882543.88296: variable 'ansible_managed' from source: unknown 28011 1726882543.88368: variable 'omit' from source: magic vars 28011 1726882543.88389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882543.88415: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882543.88431: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882543.88445: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882543.88453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882543.88474: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882543.88477: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882543.88480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882543.88547: Set connection var ansible_connection to ssh 28011 1726882543.88550: Set connection var ansible_pipelining to False 28011 1726882543.88557: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882543.88562: Set connection var ansible_shell_executable to /bin/sh 28011 1726882543.88569: Set connection var ansible_timeout to 10 28011 1726882543.88574: Set connection var ansible_shell_type to sh 28011 1726882543.88591: variable 'ansible_shell_executable' from source: unknown 28011 1726882543.88599: variable 'ansible_connection' from source: unknown 28011 1726882543.88601: variable 'ansible_module_compression' from source: unknown 28011 1726882543.88604: variable 'ansible_shell_type' from source: unknown 28011 1726882543.88606: variable 'ansible_shell_executable' from source: unknown 28011 1726882543.88608: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882543.88612: variable 'ansible_pipelining' from source: unknown 28011 1726882543.88614: variable 'ansible_timeout' from source: unknown 28011 1726882543.88619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882543.88711: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882543.88726: variable 'omit' from source: magic vars 28011 1726882543.88730: starting attempt loop 28011 1726882543.88733: running the handler 28011 1726882543.88742: _low_level_execute_command(): starting 28011 1726882543.88749: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882543.89422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882543.89429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882543.89433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882543.89442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882543.89544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882543.91252: stdout chunk (state=3): >>>/root <<< 28011 1726882543.91339: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 28011 1726882543.91374: stderr chunk (state=3): >>><<< 28011 1726882543.91379: stdout chunk (state=3): >>><<< 28011 1726882543.91613: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882543.91617: _low_level_execute_command(): starting 28011 1726882543.91621: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836 `" && echo ansible-tmp-1726882543.9140375-28658-203949485516836="` echo /root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836 `" ) && sleep 0' 28011 1726882543.91945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882543.91950: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 28011 1726882543.91963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882543.91974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882543.91987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882543.91997: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882543.92005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882543.92020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882543.92028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882543.92096: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28011 1726882543.92099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882543.92101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882543.92103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882543.92105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882543.92106: stderr chunk (state=3): >>>debug2: match found <<< 28011 1726882543.92108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882543.92147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882543.92182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882543.92185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882543.92243: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28011 1726882543.94111: stdout chunk (state=3): >>>ansible-tmp-1726882543.9140375-28658-203949485516836=/root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836 <<< 28011 1726882543.94301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882543.94304: stdout chunk (state=3): >>><<< 28011 1726882543.94306: stderr chunk (state=3): >>><<< 28011 1726882543.94309: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882543.9140375-28658-203949485516836=/root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882543.94317: variable 'ansible_module_compression' from source: unknown 28011 1726882543.94365: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 28011 
1726882543.94368: ANSIBALLZ: Acquiring lock 28011 1726882543.94370: ANSIBALLZ: Lock acquired: 139767561968752 28011 1726882543.94373: ANSIBALLZ: Creating module 28011 1726882544.07721: ANSIBALLZ: Writing module into payload 28011 1726882544.07941: ANSIBALLZ: Writing module 28011 1726882544.07959: ANSIBALLZ: Renaming module 28011 1726882544.07965: ANSIBALLZ: Done creating module 28011 1726882544.07985: variable 'ansible_facts' from source: unknown 28011 1726882544.08055: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836/AnsiballZ_network_connections.py 28011 1726882544.08161: Sending initial data 28011 1726882544.08164: Sent initial data (168 bytes) 28011 1726882544.08806: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882544.08864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882544.08867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882544.08873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
28011 1726882544.08916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882544.10476: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28011 1726882544.10480: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882544.10512: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882544.10560: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpg8zzqypn /root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836/AnsiballZ_network_connections.py <<< 28011 1726882544.10564: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836/AnsiballZ_network_connections.py" <<< 28011 1726882544.10602: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpg8zzqypn" to remote "/root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836/AnsiballZ_network_connections.py" <<< 28011 1726882544.10605: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836/AnsiballZ_network_connections.py" <<< 28011 1726882544.11312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882544.11359: stderr chunk (state=3): >>><<< 28011 1726882544.11362: stdout chunk (state=3): >>><<< 28011 1726882544.11409: done transferring module to remote 28011 1726882544.11418: _low_level_execute_command(): starting 28011 1726882544.11423: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836/ /root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836/AnsiballZ_network_connections.py && sleep 0' 28011 1726882544.11875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882544.11878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882544.11880: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882544.11883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882544.11884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882544.11930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882544.11933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882544.11987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882544.13708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882544.13736: stderr chunk (state=3): >>><<< 28011 1726882544.13739: stdout chunk (state=3): >>><<< 28011 1726882544.13753: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882544.13756: _low_level_execute_command(): starting 28011 1726882544.13761: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836/AnsiballZ_network_connections.py && sleep 0' 28011 1726882544.14209: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882544.14212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882544.14218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882544.14220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882544.14223: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882544.14270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882544.14273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882544.14277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882544.14323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882544.58299: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": 
"192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28011 1726882544.60219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882544.60223: stdout chunk (state=3): >>><<< 28011 1726882544.60230: stderr chunk (state=3): >>><<< 28011 1726882544.60251: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", 
"metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882544.60619: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26'], 'route': [{'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 30400}, {'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 30200}, {'network': '192.0.2.64', 'prefix': 26, 'gateway': '198.51.100.8', 'metric': 50, 'table': 30200, 'src': '198.51.100.3'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882544.60622: _low_level_execute_command(): starting 28011 1726882544.60624: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882543.9140375-28658-203949485516836/ > /dev/null 2>&1 && sleep 0' 28011 1726882544.61925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882544.62128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882544.62187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882544.64035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882544.64089: stderr chunk (state=3): >>><<< 28011 1726882544.64108: stdout chunk (state=3): >>><<< 28011 1726882544.64168: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882544.64171: handler run complete 28011 1726882544.64214: attempt loop complete, returning result 28011 1726882544.64217: _execute() done 28011 1726882544.64220: dumping result to json 28011 1726882544.64229: done dumping result, returning 28011 1726882544.64241: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-962d-7c65-000000000026] 28011 1726882544.64244: sending task result for task 12673a56-9f93-962d-7c65-000000000026 28011 1726882544.64775: done sending task result for task 12673a56-9f93-962d-7c65-000000000026 28011 1726882544.64778: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": 30200 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 [004] #0, state:up 
persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (not-active) 28011 1726882544.64954: no more pending results, returning what we have 28011 1726882544.64958: results queue empty 28011 1726882544.64959: checking for any_errors_fatal 28011 1726882544.64966: done checking for any_errors_fatal 28011 1726882544.64967: checking for max_fail_percentage 28011 1726882544.64969: done checking for max_fail_percentage 28011 1726882544.64970: checking to see if all hosts have failed and the running result is not ok 28011 1726882544.64970: done checking to see if all hosts have failed 28011 1726882544.64971: getting the remaining hosts for this loop 28011 1726882544.64973: done getting the remaining hosts for this loop 28011 1726882544.64976: getting the next task for host managed_node1 28011 1726882544.64982: done getting next task for host managed_node1 28011 1726882544.64986: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28011 1726882544.64989: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882544.65467: getting variables 28011 1726882544.65470: in VariableManager get_vars() 28011 1726882544.65515: Calling all_inventory to load vars for managed_node1 28011 1726882544.65518: Calling groups_inventory to load vars for managed_node1 28011 1726882544.65520: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882544.65532: Calling all_plugins_play to load vars for managed_node1 28011 1726882544.65537: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882544.65541: Calling groups_plugins_play to load vars for managed_node1 28011 1726882544.67999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882544.70327: done with get_vars() 28011 1726882544.70353: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:35:44 -0400 (0:00:00.965) 0:00:14.255 ****** 28011 1726882544.70465: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 28011 1726882544.70547: Creating lock for fedora.linux_system_roles.network_state 28011 1726882544.70963: worker is 1 (out of 1 available) 28011 1726882544.70976: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 28011 1726882544.70988: done queuing things up, now waiting for results queue to drain 28011 1726882544.70992: waiting for pending results... 
28011 1726882544.71805: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 28011 1726882544.71809: in run() - task 12673a56-9f93-962d-7c65-000000000027 28011 1726882544.71813: variable 'ansible_search_path' from source: unknown 28011 1726882544.71816: variable 'ansible_search_path' from source: unknown 28011 1726882544.71819: calling self._execute() 28011 1726882544.72300: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882544.72305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882544.72309: variable 'omit' from source: magic vars 28011 1726882544.73015: variable 'ansible_distribution_major_version' from source: facts 28011 1726882544.73034: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882544.73277: variable 'network_state' from source: role '' defaults 28011 1726882544.73297: Evaluated conditional (network_state != {}): False 28011 1726882544.73401: when evaluation is False, skipping this task 28011 1726882544.73410: _execute() done 28011 1726882544.73420: dumping result to json 28011 1726882544.73428: done dumping result, returning 28011 1726882544.73440: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-962d-7c65-000000000027] 28011 1726882544.73450: sending task result for task 12673a56-9f93-962d-7c65-000000000027 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882544.73601: no more pending results, returning what we have 28011 1726882544.73604: results queue empty 28011 1726882544.73605: checking for any_errors_fatal 28011 1726882544.73621: done checking for any_errors_fatal 28011 1726882544.73622: checking for max_fail_percentage 28011 1726882544.73623: done checking for max_fail_percentage 28011 1726882544.73624: 
checking to see if all hosts have failed and the running result is not ok 28011 1726882544.73625: done checking to see if all hosts have failed 28011 1726882544.73625: getting the remaining hosts for this loop 28011 1726882544.73627: done getting the remaining hosts for this loop 28011 1726882544.73630: getting the next task for host managed_node1 28011 1726882544.73637: done getting next task for host managed_node1 28011 1726882544.73641: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28011 1726882544.73644: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882544.73659: getting variables 28011 1726882544.73661: in VariableManager get_vars() 28011 1726882544.73708: Calling all_inventory to load vars for managed_node1 28011 1726882544.73711: Calling groups_inventory to load vars for managed_node1 28011 1726882544.73714: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882544.73726: Calling all_plugins_play to load vars for managed_node1 28011 1726882544.73729: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882544.73732: Calling groups_plugins_play to load vars for managed_node1 28011 1726882544.74301: done sending task result for task 12673a56-9f93-962d-7c65-000000000027 28011 1726882544.74304: WORKER PROCESS EXITING 28011 1726882544.76800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882544.79968: done with get_vars() 28011 1726882544.80104: done getting variables 28011 1726882544.80162: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:35:44 -0400 (0:00:00.097) 0:00:14.353 ****** 28011 1726882544.80316: entering _queue_task() for managed_node1/debug 28011 1726882544.80957: worker is 1 (out of 1 available) 28011 1726882544.81195: exiting _queue_task() for managed_node1/debug 28011 1726882544.81206: done queuing things up, now waiting for results queue to drain 28011 1726882544.81208: waiting for pending results... 
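The skip just above comes from the task's `when:` condition: `network_state` defaults to an empty dict in this role, so `network_state != {}` evaluates False and the executor returns a skip result without touching the managed node. A minimal sketch of that decision (plain Python standing in for Ansible's Jinja2 conditional evaluation; `evaluate_when` and `run_or_skip` are hypothetical helpers, not role code):

```python
def evaluate_when(task_vars):
    # Role default: network_state is {} unless the caller sets it.
    network_state = task_vars.get("network_state", {})
    return network_state != {}

def run_or_skip(task_vars):
    # Mirrors the log: a False conditional yields a "skipping" result
    # instead of executing the module on the remote host.
    if not evaluate_when(task_vars):
        return {"changed": False,
                "false_condition": "network_state != {}",
                "skip_reason": "Conditional result was False"}
    return {"changed": True}
```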
28011 1726882544.81501: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28011 1726882544.81903: in run() - task 12673a56-9f93-962d-7c65-000000000028 28011 1726882544.81907: variable 'ansible_search_path' from source: unknown 28011 1726882544.81910: variable 'ansible_search_path' from source: unknown 28011 1726882544.82035: calling self._execute() 28011 1726882544.82208: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882544.82236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882544.82292: variable 'omit' from source: magic vars 28011 1726882544.83074: variable 'ansible_distribution_major_version' from source: facts 28011 1726882544.83253: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882544.83257: variable 'omit' from source: magic vars 28011 1726882544.83273: variable 'omit' from source: magic vars 28011 1726882544.83357: variable 'omit' from source: magic vars 28011 1726882544.83535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882544.83547: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882544.83572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882544.83715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882544.83801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882544.83805: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882544.83807: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882544.83810: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 28011 1726882544.83890: Set connection var ansible_connection to ssh 28011 1726882544.84028: Set connection var ansible_pipelining to False 28011 1726882544.84044: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882544.84054: Set connection var ansible_shell_executable to /bin/sh 28011 1726882544.84066: Set connection var ansible_timeout to 10 28011 1726882544.84074: Set connection var ansible_shell_type to sh 28011 1726882544.84103: variable 'ansible_shell_executable' from source: unknown 28011 1726882544.84155: variable 'ansible_connection' from source: unknown 28011 1726882544.84163: variable 'ansible_module_compression' from source: unknown 28011 1726882544.84170: variable 'ansible_shell_type' from source: unknown 28011 1726882544.84176: variable 'ansible_shell_executable' from source: unknown 28011 1726882544.84182: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882544.84189: variable 'ansible_pipelining' from source: unknown 28011 1726882544.84258: variable 'ansible_timeout' from source: unknown 28011 1726882544.84261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882544.84584: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882544.84588: variable 'omit' from source: magic vars 28011 1726882544.84590: starting attempt loop 28011 1726882544.84592: running the handler 28011 1726882544.85010: variable '__network_connections_result' from source: set_fact 28011 1726882544.85013: handler run complete 28011 1726882544.85015: attempt loop complete, returning result 28011 1726882544.85017: _execute() done 28011 1726882544.85019: dumping result to json 28011 1726882544.85021: 
done dumping result, returning 28011 1726882544.85023: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-962d-7c65-000000000028] 28011 1726882544.85123: sending task result for task 12673a56-9f93-962d-7c65-000000000028 28011 1726882544.85401: done sending task result for task 12673a56-9f93-962d-7c65-000000000028 28011 1726882544.85403: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "__network_connections_result.stderr_lines": [
        "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1",
        "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (not-active)"
    ]
}
28011 1726882544.85470: no more pending results, returning what we have 28011 1726882544.85474: results queue empty 28011 1726882544.85475: checking for any_errors_fatal 28011 1726882544.85484: done checking for any_errors_fatal 28011 1726882544.85485: checking for max_fail_percentage 28011 1726882544.85486: done checking for max_fail_percentage 28011 1726882544.85487: checking to see if all hosts have failed and the running result is not ok 28011 1726882544.85488: done checking to see if all hosts have failed 28011 1726882544.85489: getting the remaining hosts for this loop 28011 1726882544.85490: done getting the remaining hosts for this loop 28011 1726882544.85496: getting the next task for host managed_node1 28011 1726882544.85503: done getting next task for host managed_node1 28011 1726882544.85507: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28011 1726882544.85510: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882544.85522: getting variables 28011 1726882544.85524: in VariableManager get_vars() 28011 1726882544.85566: Calling all_inventory to load vars for managed_node1 28011 1726882544.85569: Calling groups_inventory to load vars for managed_node1 28011 1726882544.85571: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882544.85581: Calling all_plugins_play to load vars for managed_node1 28011 1726882544.85583: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882544.85586: Calling groups_plugins_play to load vars for managed_node1 28011 1726882544.88749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882544.92258: done with get_vars() 28011 1726882544.92283: done getting variables 28011 1726882544.92515: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:35:44 -0400 (0:00:00.124) 0:00:14.477 ****** 28011 1726882544.92666: entering _queue_task() for managed_node1/debug 28011 1726882544.93376: worker is 1 (out of 1 available) 28011 1726882544.93389: exiting _queue_task() for 
managed_node1/debug 28011 1726882544.93409: done queuing things up, now waiting for results queue to drain 28011 1726882544.93411: waiting for pending results... 28011 1726882544.94092: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28011 1726882544.94619: in run() - task 12673a56-9f93-962d-7c65-000000000029 28011 1726882544.94635: variable 'ansible_search_path' from source: unknown 28011 1726882544.94639: variable 'ansible_search_path' from source: unknown 28011 1726882544.94701: calling self._execute() 28011 1726882544.94781: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882544.94800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882544.95205: variable 'omit' from source: magic vars 28011 1726882544.96010: variable 'ansible_distribution_major_version' from source: facts 28011 1726882544.96417: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882544.96445: variable 'omit' from source: magic vars 28011 1726882544.96484: variable 'omit' from source: magic vars 28011 1726882544.96525: variable 'omit' from source: magic vars 28011 1726882544.96598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882544.97004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882544.97026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882544.97043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882544.97109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882544.97112: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 28011 1726882544.97114: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882544.97116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882544.97621: Set connection var ansible_connection to ssh 28011 1726882544.97629: Set connection var ansible_pipelining to False 28011 1726882544.97635: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882544.97641: Set connection var ansible_shell_executable to /bin/sh 28011 1726882544.97651: Set connection var ansible_timeout to 10 28011 1726882544.97653: Set connection var ansible_shell_type to sh 28011 1726882544.97679: variable 'ansible_shell_executable' from source: unknown 28011 1726882544.97682: variable 'ansible_connection' from source: unknown 28011 1726882544.97685: variable 'ansible_module_compression' from source: unknown 28011 1726882544.97687: variable 'ansible_shell_type' from source: unknown 28011 1726882544.97690: variable 'ansible_shell_executable' from source: unknown 28011 1726882544.97696: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882544.97701: variable 'ansible_pipelining' from source: unknown 28011 1726882544.97703: variable 'ansible_timeout' from source: unknown 28011 1726882544.97708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882544.98251: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882544.98263: variable 'omit' from source: magic vars 28011 1726882544.98268: starting attempt loop 28011 1726882544.98271: running the handler 28011 1726882544.98325: variable '__network_connections_result' from source: set_fact 28011 1726882544.98806: variable 
'__network_connections_result' from source: set_fact 28011 1726882544.98983: handler run complete 28011 1726882544.99430: attempt loop complete, returning result 28011 1726882544.99433: _execute() done 28011 1726882544.99436: dumping result to json 28011 1726882544.99440: done dumping result, returning 28011 1726882544.99450: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-962d-7c65-000000000029] 28011 1726882544.99455: sending task result for task 12673a56-9f93-962d-7c65-000000000029
ok: [managed_node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "autoconnect": true,
                        "interface_name": "ethtest0",
                        "ip": {
                            "address": [
                                "198.51.100.3/26"
                            ],
                            "dhcp4": false,
                            "route": [
                                {
                                    "gateway": "198.51.100.1",
                                    "metric": 2,
                                    "network": "198.51.100.128",
                                    "prefix": 26,
                                    "table": 30400
                                },
                                {
                                    "gateway": "198.51.100.6",
                                    "metric": 4,
                                    "network": "198.51.100.64",
                                    "prefix": 26,
                                    "table": 30200
                                },
                                {
                                    "gateway": "198.51.100.8",
                                    "metric": 50,
                                    "network": "192.0.2.64",
                                    "prefix": 26,
                                    "src": "198.51.100.3",
                                    "table": 30200
                                }
                            ]
                        },
                        "name": "ethtest0",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (not-active)\n",
        "stderr_lines": [
            "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1",
            "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (not-active)"
        ]
    }
}
28011 1726882544.99779: no more pending 
results, returning what we have 28011 1726882544.99783: results queue empty 28011 1726882544.99784: checking for any_errors_fatal 28011 1726882544.99791: done checking for any_errors_fatal 28011 1726882544.99792: checking for max_fail_percentage 28011 1726882544.99796: done checking for max_fail_percentage 28011 1726882544.99797: checking to see if all hosts have failed and the running result is not ok 28011 1726882544.99797: done checking to see if all hosts have failed 28011 1726882544.99798: getting the remaining hosts for this loop 28011 1726882544.99800: done getting the remaining hosts for this loop 28011 1726882544.99804: getting the next task for host managed_node1 28011 1726882544.99818: done getting next task for host managed_node1 28011 1726882544.99822: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28011 1726882544.99826: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882544.99839: getting variables 28011 1726882544.99841: in VariableManager get_vars() 28011 1726882544.99882: Calling all_inventory to load vars for managed_node1 28011 1726882544.99885: Calling groups_inventory to load vars for managed_node1 28011 1726882544.99888: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882545.00001: Calling all_plugins_play to load vars for managed_node1 28011 1726882545.00118: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882545.00123: Calling groups_plugins_play to load vars for managed_node1 28011 1726882545.00735: done sending task result for task 12673a56-9f93-962d-7c65-000000000029 28011 1726882545.00738: WORKER PROCESS EXITING 28011 1726882545.02844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882545.06167: done with get_vars() 28011 1726882545.06203: done getting variables 28011 1726882545.06331: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:35:45 -0400 (0:00:00.138) 0:00:14.616 ****** 28011 1726882545.06485: entering _queue_task() for managed_node1/debug 28011 1726882545.07337: worker is 1 (out of 1 available) 28011 1726882545.07349: exiting _queue_task() for managed_node1/debug 28011 1726882545.07359: done queuing things up, now waiting for results queue to drain 28011 1726882545.07361: waiting for pending results... 
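The `stderr_lines` reported above follow a fixed shape: `[seq] #index, state:<s> persistent_state:<p>, '<name>': <action>`. A small sketch for pulling those fields apart (a hypothetical helper, not part of the role):

```python
import re

# One checkpoint line of the network_connections module's stderr.
CHECKPOINT_RE = re.compile(
    r"\[(?P<seq>\d+)\] #(?P<idx>\d+), "
    r"state:(?P<state>\S+) persistent_state:(?P<persistent_state>\S+), "
    r"'(?P<name>[^']+)': (?P<action>.+)"
)

def parse_stderr_line(line):
    """Split one network_connections stderr line into named fields."""
    m = CHECKPOINT_RE.match(line)
    return m.groupdict() if m else None

rec = parse_stderr_line(
    "[003] #0, state:up persistent_state:present, "
    "'ethtest0': add connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1"
)
```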
28011 1726882545.07695: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28011 1726882545.08158: in run() - task 12673a56-9f93-962d-7c65-00000000002a 28011 1726882545.08161: variable 'ansible_search_path' from source: unknown 28011 1726882545.08164: variable 'ansible_search_path' from source: unknown 28011 1726882545.08166: calling self._execute() 28011 1726882545.08484: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882545.08488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882545.08490: variable 'omit' from source: magic vars 28011 1726882545.09261: variable 'ansible_distribution_major_version' from source: facts 28011 1726882545.09280: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882545.09414: variable 'network_state' from source: role '' defaults 28011 1726882545.09491: Evaluated conditional (network_state != {}): False 28011 1726882545.09501: when evaluation is False, skipping this task 28011 1726882545.09508: _execute() done 28011 1726882545.09515: dumping result to json 28011 1726882545.09522: done dumping result, returning 28011 1726882545.09534: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-962d-7c65-00000000002a] 28011 1726882545.09599: sending task result for task 12673a56-9f93-962d-7c65-00000000002a skipping: [managed_node1] => { "false_condition": "network_state != {}" } 28011 1726882545.09764: no more pending results, returning what we have 28011 1726882545.09768: results queue empty 28011 1726882545.09769: checking for any_errors_fatal 28011 1726882545.09779: done checking for any_errors_fatal 28011 1726882545.09780: checking for max_fail_percentage 28011 1726882545.09782: done checking for max_fail_percentage 28011 1726882545.09783: checking to see if all hosts have 
failed and the running result is not ok 28011 1726882545.09784: done checking to see if all hosts have failed 28011 1726882545.09784: getting the remaining hosts for this loop 28011 1726882545.09786: done getting the remaining hosts for this loop 28011 1726882545.09789: getting the next task for host managed_node1 28011 1726882545.09798: done getting next task for host managed_node1 28011 1726882545.09802: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28011 1726882545.09805: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882545.09824: getting variables 28011 1726882545.09828: in VariableManager get_vars() 28011 1726882545.09871: Calling all_inventory to load vars for managed_node1 28011 1726882545.09873: Calling groups_inventory to load vars for managed_node1 28011 1726882545.09876: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882545.09888: Calling all_plugins_play to load vars for managed_node1 28011 1726882545.09891: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882545.10102: Calling groups_plugins_play to load vars for managed_node1 28011 1726882545.10834: done sending task result for task 12673a56-9f93-962d-7c65-00000000002a 28011 1726882545.10837: WORKER PROCESS EXITING 28011 1726882545.13316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882545.16682: done with get_vars() 28011 1726882545.16713: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:35:45 -0400 (0:00:00.104) 0:00:14.720 ****** 28011 1726882545.16944: entering _queue_task() for managed_node1/ping 28011 1726882545.16946: Creating lock for ping 28011 1726882545.17556: worker is 1 (out of 1 available) 28011 1726882545.17568: exiting _queue_task() for managed_node1/ping 28011 1726882545.17579: done queuing things up, now waiting for results queue to drain 28011 1726882545.17581: waiting for pending results... 
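The "Re-test connectivity" task just queued uses the `ping` action (hence the "Creating lock for ping" line: the AnsiballZ payload for the module is built once under a lock and cached). `ansible.builtin.ping` is not ICMP; it is a trivial module round-trip proving the host is reachable and has a usable Python. A sketch of its contract (not the real implementation):

```python
def ping(data="pong"):
    """Echo `data` back the way ansible.builtin.ping does; passing
    data='crash' makes the module raise, for failure-path testing."""
    if data == "crash":
        raise Exception("boom")
    return {"ping": data}
```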
28011 1726882545.18176: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 28011 1726882545.18592: in run() - task 12673a56-9f93-962d-7c65-00000000002b 28011 1726882545.18606: variable 'ansible_search_path' from source: unknown 28011 1726882545.18723: variable 'ansible_search_path' from source: unknown 28011 1726882545.18727: calling self._execute() 28011 1726882545.18989: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882545.18994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882545.18997: variable 'omit' from source: magic vars 28011 1726882545.19642: variable 'ansible_distribution_major_version' from source: facts 28011 1726882545.19646: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882545.19649: variable 'omit' from source: magic vars 28011 1726882545.19817: variable 'omit' from source: magic vars 28011 1726882545.19856: variable 'omit' from source: magic vars 28011 1726882545.19972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882545.20134: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882545.20137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882545.20143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882545.20199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882545.20351: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882545.20358: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882545.20360: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 28011 1726882545.20599: Set connection var ansible_connection to ssh 28011 1726882545.20602: Set connection var ansible_pipelining to False 28011 1726882545.20605: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882545.20644: Set connection var ansible_shell_executable to /bin/sh 28011 1726882545.20657: Set connection var ansible_timeout to 10 28011 1726882545.20691: Set connection var ansible_shell_type to sh 28011 1726882545.20767: variable 'ansible_shell_executable' from source: unknown 28011 1726882545.21179: variable 'ansible_connection' from source: unknown 28011 1726882545.21182: variable 'ansible_module_compression' from source: unknown 28011 1726882545.21184: variable 'ansible_shell_type' from source: unknown 28011 1726882545.21186: variable 'ansible_shell_executable' from source: unknown 28011 1726882545.21188: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882545.21190: variable 'ansible_pipelining' from source: unknown 28011 1726882545.21192: variable 'ansible_timeout' from source: unknown 28011 1726882545.21196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882545.21521: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882545.21751: variable 'omit' from source: magic vars 28011 1726882545.21754: starting attempt loop 28011 1726882545.21756: running the handler 28011 1726882545.21758: _low_level_execute_command(): starting 28011 1726882545.21861: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882545.23064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882545.23081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882545.23539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882545.23612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882545.23989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882545.25704: stdout chunk (state=3): >>>/root <<< 28011 1726882545.25759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882545.25798: stderr chunk (state=3): >>><<< 28011 1726882545.25902: stdout chunk (state=3): >>><<< 28011 1726882545.25921: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882545.25940: _low_level_execute_command(): starting 28011 1726882545.25950: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518 `" && echo ansible-tmp-1726882545.2592757-28727-264193481908518="` echo /root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518 `" ) && sleep 0' 28011 1726882545.27409: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882545.27617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882545.27637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882545.27705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882545.29598: stdout chunk (state=3): >>>ansible-tmp-1726882545.2592757-28727-264193481908518=/root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518 <<< 28011 1726882545.29758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882545.29761: stdout chunk (state=3): >>><<< 28011 1726882545.29769: stderr chunk (state=3): >>><<< 28011 1726882545.29788: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882545.2592757-28727-264193481908518=/root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882545.29837: variable 'ansible_module_compression' from source: unknown 28011 1726882545.29878: ANSIBALLZ: Using lock for ping 28011 1726882545.29881: ANSIBALLZ: Acquiring lock 28011 1726882545.29884: ANSIBALLZ: Lock acquired: 139767560923440 28011 1726882545.29886: ANSIBALLZ: Creating module 28011 1726882545.49757: ANSIBALLZ: Writing module into payload 28011 1726882545.49761: ANSIBALLZ: Writing module 28011 1726882545.49763: ANSIBALLZ: Renaming module 28011 1726882545.49765: ANSIBALLZ: Done creating module 28011 1726882545.49778: variable 'ansible_facts' from source: unknown 28011 1726882545.49974: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518/AnsiballZ_ping.py 28011 1726882545.50068: Sending initial data 28011 1726882545.50080: Sent initial data (153 bytes) 28011 1726882545.50574: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882545.50583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882545.50596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882545.50848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882545.50852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882545.50854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882545.50856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882545.52417: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882545.52455: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882545.52494: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpwr3wtnao /root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518/AnsiballZ_ping.py <<< 28011 1726882545.52498: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518/AnsiballZ_ping.py" <<< 28011 1726882545.52549: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpwr3wtnao" to remote "/root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518/AnsiballZ_ping.py" <<< 28011 1726882545.53729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882545.53797: stderr chunk (state=3): >>><<< 28011 1726882545.53800: stdout chunk (state=3): >>><<< 28011 1726882545.53829: done transferring module to remote 28011 1726882545.53839: _low_level_execute_command(): starting 28011 1726882545.53845: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518/ /root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518/AnsiballZ_ping.py && sleep 0' 28011 1726882545.55099: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882545.55103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882545.55121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882545.55126: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882545.55207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882545.55324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882545.55332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882545.55352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882545.55521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882545.57373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882545.57376: stderr chunk (state=3): >>><<< 28011 1726882545.57381: stdout chunk (state=3): >>><<< 28011 1726882545.57401: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882545.57405: _low_level_execute_command(): starting 28011 1726882545.57410: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518/AnsiballZ_ping.py && sleep 0' 28011 1726882545.58504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882545.58508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882545.58810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882545.58911: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882545.58987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882545.73785: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28011 1726882545.74946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882545.74972: stderr chunk (state=3): >>><<< 28011 1726882545.74982: stdout chunk (state=3): >>><<< 28011 1726882545.75008: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
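The module run above returned `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}`. As a minimal sketch (not the actual AnsiballZ payload that was transferred and executed above), the ping module simply echoes back its `data` argument, which defaults to `"pong"`:

```python
import json

def ping_module(module_args=None):
    """Minimal stand-in for Ansible's ping module: echo back the
    'data' argument (default 'pong'). This is a sketch of the
    observed behavior, not the real module source."""
    args = module_args or {}
    data = args.get("data", "pong")
    # The real ping module deliberately raises when data == 'crash',
    # which is how connectivity tests exercise the failure path.
    if data == "crash":
        raise RuntimeError("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping_module({"data": "pong"})))
# → {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}
```

The JSON printed on stdout is exactly what `_low_level_execute_command()` captures above before the temporary directory is removed.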
28011 1726882545.75102: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882545.75119: _low_level_execute_command(): starting 28011 1726882545.75130: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882545.2592757-28727-264193481908518/ > /dev/null 2>&1 && sleep 0' 28011 1726882545.75804: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882545.75807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882545.75809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882545.75811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882545.75814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882545.75816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882545.75869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882545.75889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882545.75908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882545.75985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882545.78100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882545.78104: stdout chunk (state=3): >>><<< 28011 1726882545.78107: stderr chunk (state=3): >>><<< 28011 1726882545.78110: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882545.78117: handler run complete 28011 1726882545.78120: attempt loop complete, returning result 28011 1726882545.78122: _execute() done 28011 1726882545.78124: dumping result to json 28011 1726882545.78126: done dumping result, returning 28011 1726882545.78128: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-962d-7c65-00000000002b] 28011 1726882545.78130: sending task result for task 12673a56-9f93-962d-7c65-00000000002b 28011 1726882545.78202: done sending task result for task 12673a56-9f93-962d-7c65-00000000002b 28011 1726882545.78206: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 28011 1726882545.78452: no more pending results, returning what we have 28011 1726882545.78456: results queue empty 28011 1726882545.78456: checking for any_errors_fatal 28011 1726882545.78463: done checking for any_errors_fatal 28011 1726882545.78464: checking for max_fail_percentage 28011 1726882545.78466: done checking for max_fail_percentage 28011 1726882545.78466: checking to see if all hosts have failed and the running result is not ok 28011 1726882545.78467: done checking to see if all hosts have failed 28011 1726882545.78468: getting the remaining hosts for this loop 28011 1726882545.78469: done getting the remaining hosts for this loop 28011 1726882545.78473: getting the next task for host managed_node1 28011 1726882545.78483: done getting next task for host managed_node1 28011 1726882545.78485: ^ task is: TASK: meta (role_complete) 28011 1726882545.78488: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882545.78500: getting variables 28011 1726882545.78502: in VariableManager get_vars() 28011 1726882545.78545: Calling all_inventory to load vars for managed_node1 28011 1726882545.78547: Calling groups_inventory to load vars for managed_node1 28011 1726882545.78550: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882545.78559: Calling all_plugins_play to load vars for managed_node1 28011 1726882545.78562: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882545.78564: Calling groups_plugins_play to load vars for managed_node1 28011 1726882545.80066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882545.81576: done with get_vars() 28011 1726882545.81604: done getting variables 28011 1726882545.81689: done queuing things up, now waiting for results queue to drain 28011 1726882545.81691: results queue empty 28011 1726882545.81692: checking for any_errors_fatal 28011 1726882545.81696: done checking for any_errors_fatal 28011 1726882545.81697: checking for max_fail_percentage 28011 1726882545.81698: done checking for max_fail_percentage 28011 1726882545.81699: checking to see if all hosts have failed and the running result is not ok 28011 1726882545.81700: done checking to see if all hosts have failed 28011 1726882545.81700: getting the remaining hosts for this loop 28011 1726882545.81701: done getting the remaining hosts for this loop 28011 1726882545.81704: getting the next task for host managed_node1 28011 1726882545.81708: done getting next task for host 
managed_node1 28011 1726882545.81710: ^ task is: TASK: Get the routes from the route table 30200 28011 1726882545.81712: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882545.81714: getting variables 28011 1726882545.81715: in VariableManager get_vars() 28011 1726882545.81730: Calling all_inventory to load vars for managed_node1 28011 1726882545.81732: Calling groups_inventory to load vars for managed_node1 28011 1726882545.81734: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882545.81739: Calling all_plugins_play to load vars for managed_node1 28011 1726882545.81741: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882545.81744: Calling groups_plugins_play to load vars for managed_node1 28011 1726882545.82979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882545.84502: done with get_vars() 28011 1726882545.84528: done getting variables 28011 1726882545.84571: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the routes from the route table 30200] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:56 Friday 20 September 2024 21:35:45 -0400 (0:00:00.676) 0:00:15.397 ****** 28011 1726882545.84600: entering _queue_task() for managed_node1/command 28011 1726882545.85012: worker is 1 (out of 1 available) 28011 1726882545.85025: 
exiting _queue_task() for managed_node1/command 28011 1726882545.85036: done queuing things up, now waiting for results queue to drain 28011 1726882545.85037: waiting for pending results... 28011 1726882545.85415: running TaskExecutor() for managed_node1/TASK: Get the routes from the route table 30200 28011 1726882545.85439: in run() - task 12673a56-9f93-962d-7c65-00000000005b 28011 1726882545.85461: variable 'ansible_search_path' from source: unknown 28011 1726882545.85506: calling self._execute() 28011 1726882545.85641: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882545.85644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882545.85647: variable 'omit' from source: magic vars 28011 1726882545.86032: variable 'ansible_distribution_major_version' from source: facts 28011 1726882545.86051: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882545.86076: variable 'omit' from source: magic vars 28011 1726882545.86091: variable 'omit' from source: magic vars 28011 1726882545.86186: variable 'omit' from source: magic vars 28011 1726882545.86189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882545.86221: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882545.86247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882545.86272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882545.86290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882545.86330: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882545.86339: variable 'ansible_host' from source: host vars for 
'managed_node1' 28011 1726882545.86347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882545.86454: Set connection var ansible_connection to ssh 28011 1726882545.86467: Set connection var ansible_pipelining to False 28011 1726882545.86515: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882545.86518: Set connection var ansible_shell_executable to /bin/sh 28011 1726882545.86520: Set connection var ansible_timeout to 10 28011 1726882545.86522: Set connection var ansible_shell_type to sh 28011 1726882545.86536: variable 'ansible_shell_executable' from source: unknown 28011 1726882545.86545: variable 'ansible_connection' from source: unknown 28011 1726882545.86552: variable 'ansible_module_compression' from source: unknown 28011 1726882545.86559: variable 'ansible_shell_type' from source: unknown 28011 1726882545.86565: variable 'ansible_shell_executable' from source: unknown 28011 1726882545.86597: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882545.86600: variable 'ansible_pipelining' from source: unknown 28011 1726882545.86602: variable 'ansible_timeout' from source: unknown 28011 1726882545.86604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882545.86756: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882545.86772: variable 'omit' from source: magic vars 28011 1726882545.86781: starting attempt loop 28011 1726882545.86841: running the handler 28011 1726882545.86843: _low_level_execute_command(): starting 28011 1726882545.86845: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882545.87522: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882545.87537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882545.87552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882545.87575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882545.87591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882545.87685: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882545.87705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882545.87732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882545.87749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882545.87920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882545.89496: stdout chunk (state=3): >>>/root <<< 28011 1726882545.89633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882545.89648: stderr chunk (state=3): >>><<< 28011 1726882545.89732: stdout chunk (state=3): >>><<< 28011 1726882545.89735: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882545.89755: _low_level_execute_command(): starting 28011 1726882545.89840: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393 `" && echo ansible-tmp-1726882545.8974268-28745-81086027801393="` echo /root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393 `" ) && sleep 0' 28011 1726882545.91155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882545.91244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882545.91412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882545.91465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882545.91510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882545.93376: stdout chunk (state=3): >>>ansible-tmp-1726882545.8974268-28745-81086027801393=/root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393 <<< 28011 1726882545.93481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882545.93699: stderr chunk (state=3): >>><<< 28011 1726882545.93703: stdout chunk (state=3): >>><<< 28011 1726882545.93706: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882545.8974268-28745-81086027801393=/root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882545.93709: variable 'ansible_module_compression' from source: unknown 28011 1726882545.93758: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28011 1726882545.93880: variable 'ansible_facts' from source: unknown 28011 1726882545.94200: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393/AnsiballZ_command.py 28011 1726882545.94421: Sending initial data 28011 1726882545.94430: Sent initial data (155 bytes) 28011 1726882545.95591: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882545.95811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882545.95940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882545.95972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882545.97496: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28011 1726882545.97509: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882545.97677: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882545.97786: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp9k_y8q7r /root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393/AnsiballZ_command.py <<< 28011 1726882545.97799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393/AnsiballZ_command.py" <<< 28011 1726882545.98038: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp9k_y8q7r" to remote "/root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393/AnsiballZ_command.py" <<< 28011 1726882545.99901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882545.99911: stdout chunk (state=3): >>><<< 28011 1726882545.99922: stderr chunk (state=3): >>><<< 28011 1726882545.99953: done transferring module to remote 28011 1726882546.00012: _low_level_execute_command(): starting 28011 1726882546.00023: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393/ /root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393/AnsiballZ_command.py && sleep 0' 28011 1726882546.01220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882546.01531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882546.01547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882546.01562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882546.01632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882546.03469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882546.03479: stdout chunk (state=3): >>><<< 28011 1726882546.03497: stderr chunk (state=3): >>><<< 28011 1726882546.03518: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882546.03530: _low_level_execute_command(): starting 28011 1726882546.03538: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393/AnsiballZ_command.py && sleep 0' 28011 1726882546.04736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882546.04769: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882546.04909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882546.04979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882546.05001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882546.05092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882546.05331: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882546.20614: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30200"], "start": "2024-09-20 21:35:46.201380", "end": "2024-09-20 21:35:46.204980", "delta": "0:00:00.003600", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882546.22005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882546.22022: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 28011 1726882546.22120: stderr chunk (state=3): >>><<< 28011 1726882546.22130: stdout chunk (state=3): >>><<< 28011 1726882546.22198: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30200"], "start": "2024-09-20 21:35:46.201380", "end": "2024-09-20 21:35:46.204980", "delta": "0:00:00.003600", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882546.22402: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route show table 30200', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882546.22406: _low_level_execute_command(): starting 28011 1726882546.22409: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882545.8974268-28745-81086027801393/ > /dev/null 2>&1 && sleep 0' 28011 1726882546.23836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882546.23851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882546.24051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882546.24062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882546.24456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882546.24478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882546.24528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882546.26297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882546.26328: stderr chunk (state=3): >>><<< 28011 1726882546.26337: stdout chunk (state=3): >>><<< 28011 1726882546.26699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 28011 1726882546.26708: handler run complete 28011 1726882546.26710: Evaluated conditional (False): False 28011 1726882546.26712: attempt loop complete, returning result 28011 1726882546.26714: _execute() done 28011 1726882546.26716: dumping result to json 28011 1726882546.26718: done dumping result, returning 28011 1726882546.26720: done running TaskExecutor() for managed_node1/TASK: Get the routes from the route table 30200 [12673a56-9f93-962d-7c65-00000000005b] 28011 1726882546.26722: sending task result for task 12673a56-9f93-962d-7c65-00000000005b 28011 1726882546.26798: done sending task result for task 12673a56-9f93-962d-7c65-00000000005b 28011 1726882546.26802: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "route", "show", "table", "30200" ], "delta": "0:00:00.003600", "end": "2024-09-20 21:35:46.204980", "rc": 0, "start": "2024-09-20 21:35:46.201380" } STDOUT: 192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 28011 1726882546.26880: no more pending results, returning what we have 28011 1726882546.26883: results queue empty 28011 1726882546.26884: checking for any_errors_fatal 28011 1726882546.26886: done checking for any_errors_fatal 28011 1726882546.26887: checking for max_fail_percentage 28011 1726882546.26888: done checking for max_fail_percentage 28011 1726882546.26892: checking to see if all hosts have failed and the running result is not ok 28011 1726882546.26894: done checking to see if all hosts have failed 28011 1726882546.26895: getting the remaining hosts for this loop 28011 1726882546.26897: done getting the remaining hosts for this loop 28011 1726882546.26901: getting the next task for host managed_node1 28011 1726882546.26907: done getting next task for host managed_node1 28011 1726882546.26911: ^ task is: TASK: Get the routes from the route table 30400 28011 1726882546.26913: ^ state 
is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882546.26918: getting variables 28011 1726882546.26920: in VariableManager get_vars() 28011 1726882546.26961: Calling all_inventory to load vars for managed_node1 28011 1726882546.26964: Calling groups_inventory to load vars for managed_node1 28011 1726882546.26966: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882546.26978: Calling all_plugins_play to load vars for managed_node1 28011 1726882546.26981: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882546.26984: Calling groups_plugins_play to load vars for managed_node1 28011 1726882546.30469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882546.33431: done with get_vars() 28011 1726882546.33994: done getting variables 28011 1726882546.34067: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the routes from the route table 30400] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:62 Friday 20 September 2024 21:35:46 -0400 (0:00:00.494) 0:00:15.892 ****** 28011 1726882546.34100: entering _queue_task() for managed_node1/command 28011 1726882546.35364: worker is 1 (out of 1 available) 28011 1726882546.35377: exiting _queue_task() for managed_node1/command 28011 1726882546.35389: done queuing things up, now waiting for results queue to drain 
28011 1726882546.35391: waiting for pending results... 28011 1726882546.36511: running TaskExecutor() for managed_node1/TASK: Get the routes from the route table 30400 28011 1726882546.36516: in run() - task 12673a56-9f93-962d-7c65-00000000005c 28011 1726882546.36521: variable 'ansible_search_path' from source: unknown 28011 1726882546.36702: calling self._execute() 28011 1726882546.36806: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882546.37281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882546.37284: variable 'omit' from source: magic vars 28011 1726882546.37847: variable 'ansible_distribution_major_version' from source: facts 28011 1726882546.38113: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882546.38298: variable 'omit' from source: magic vars 28011 1726882546.38301: variable 'omit' from source: magic vars 28011 1726882546.38304: variable 'omit' from source: magic vars 28011 1726882546.38306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882546.38309: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882546.38311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882546.38401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882546.38421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882546.38458: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882546.38467: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882546.38475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882546.38578: 
Set connection var ansible_connection to ssh 28011 1726882546.38708: Set connection var ansible_pipelining to False 28011 1726882546.38718: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882546.38725: Set connection var ansible_shell_executable to /bin/sh 28011 1726882546.38734: Set connection var ansible_timeout to 10 28011 1726882546.38741: Set connection var ansible_shell_type to sh 28011 1726882546.38764: variable 'ansible_shell_executable' from source: unknown 28011 1726882546.39098: variable 'ansible_connection' from source: unknown 28011 1726882546.39102: variable 'ansible_module_compression' from source: unknown 28011 1726882546.39105: variable 'ansible_shell_type' from source: unknown 28011 1726882546.39107: variable 'ansible_shell_executable' from source: unknown 28011 1726882546.39109: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882546.39111: variable 'ansible_pipelining' from source: unknown 28011 1726882546.39113: variable 'ansible_timeout' from source: unknown 28011 1726882546.39115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882546.39118: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882546.39120: variable 'omit' from source: magic vars 28011 1726882546.39122: starting attempt loop 28011 1726882546.39125: running the handler 28011 1726882546.39510: _low_level_execute_command(): starting 28011 1726882546.39525: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882546.40664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882546.40912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882546.41111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882546.41571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882546.43150: stdout chunk (state=3): >>>/root <<< 28011 1726882546.43296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882546.43311: stderr chunk (state=3): >>><<< 28011 1726882546.43321: stdout chunk (state=3): >>><<< 28011 1726882546.43347: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882546.43364: _low_level_execute_command(): starting 28011 1726882546.43371: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664 `" && echo ansible-tmp-1726882546.4334762-28763-135169646575664="` echo /root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664 `" ) && sleep 0' 28011 1726882546.44397: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882546.44401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882546.44529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882546.44532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882546.44570: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 
1726882546.44583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882546.44885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882546.44945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882546.45043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882546.46998: stdout chunk (state=3): >>>ansible-tmp-1726882546.4334762-28763-135169646575664=/root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664 <<< 28011 1726882546.47001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882546.47004: stderr chunk (state=3): >>><<< 28011 1726882546.47006: stdout chunk (state=3): >>><<< 28011 1726882546.47352: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882546.4334762-28763-135169646575664=/root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882546.47358: variable 'ansible_module_compression' from source: unknown 28011 1726882546.47413: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28011 1726882546.47456: variable 'ansible_facts' from source: unknown 28011 1726882546.48013: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664/AnsiballZ_command.py 28011 1726882546.48417: Sending initial data 28011 1726882546.48420: Sent initial data (156 bytes) 28011 1726882546.49374: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882546.49482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882546.49521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882546.51010: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28011 1726882546.51014: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882546.51104: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882546.51149: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpcexq9y1b /root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664/AnsiballZ_command.py <<< 28011 1726882546.51155: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664/AnsiballZ_command.py" <<< 28011 1726882546.51199: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpcexq9y1b" to remote "/root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664/AnsiballZ_command.py" <<< 28011 1726882546.52510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882546.52549: stdout chunk (state=3): >>><<< 28011 1726882546.52762: stderr chunk (state=3): >>><<< 28011 1726882546.52765: done transferring module to remote 28011 1726882546.52788: _low_level_execute_command(): starting 28011 1726882546.52794: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664/ /root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664/AnsiballZ_command.py && sleep 0' 28011 1726882546.53480: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882546.53508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882546.53543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 
1726882546.53649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882546.53678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882546.53812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882546.55537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882546.55699: stderr chunk (state=3): >>><<< 28011 1726882546.55702: stdout chunk (state=3): >>><<< 28011 1726882546.55705: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882546.55707: _low_level_execute_command(): starting 28011 1726882546.55709: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664/AnsiballZ_command.py && sleep 0' 28011 1726882546.56550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882546.56562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882546.56630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882546.56681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882546.56707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882546.56734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 28011 1726882546.56805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882546.72051: stdout chunk (state=3): >>> {"changed": true, "stdout": "198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30400"], "start": "2024-09-20 21:35:46.715967", "end": "2024-09-20 21:35:46.719424", "delta": "0:00:00.003457", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882546.73528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882546.73541: stderr chunk (state=3): >>><<< 28011 1726882546.73549: stdout chunk (state=3): >>><<< 28011 1726882546.73568: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30400"], "start": "2024-09-20 21:35:46.715967", "end": "2024-09-20 21:35:46.719424", "delta": "0:00:00.003457", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882546.73599: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route show table 30400', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882546.73606: _low_level_execute_command(): starting 28011 1726882546.73611: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882546.4334762-28763-135169646575664/ > /dev/null 2>&1 && sleep 0' 28011 1726882546.74155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882546.74230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882546.74233: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882546.74289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882546.74318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882546.74364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882546.74419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882546.76225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882546.76228: stderr chunk (state=3): >>><<< 28011 1726882546.76229: stdout chunk (state=3): >>><<< 28011 1726882546.76241: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882546.76248: handler run complete 28011 1726882546.76282: Evaluated conditional (False): False 28011 1726882546.76285: attempt loop complete, returning result 28011 1726882546.76287: _execute() done 28011 1726882546.76292: dumping result to json 28011 1726882546.76297: done dumping result, returning 28011 1726882546.76300: done running TaskExecutor() for managed_node1/TASK: Get the routes from the route table 30400 [12673a56-9f93-962d-7c65-00000000005c] 28011 1726882546.76302: sending task result for task 12673a56-9f93-962d-7c65-00000000005c 28011 1726882546.76412: done sending task result for task 12673a56-9f93-962d-7c65-00000000005c 28011 1726882546.76415: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "route", "show", "table", "30400" ], "delta": "0:00:00.003457", "end": "2024-09-20 21:35:46.719424", "rc": 0, "start": "2024-09-20 21:35:46.715967" } STDOUT: 198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 28011 1726882546.76502: no more pending results, returning what we have 28011 1726882546.76505: results queue empty 28011 
1726882546.76506: checking for any_errors_fatal 28011 1726882546.76515: done checking for any_errors_fatal 28011 1726882546.76515: checking for max_fail_percentage 28011 1726882546.76517: done checking for max_fail_percentage 28011 1726882546.76518: checking to see if all hosts have failed and the running result is not ok 28011 1726882546.76519: done checking to see if all hosts have failed 28011 1726882546.76519: getting the remaining hosts for this loop 28011 1726882546.76521: done getting the remaining hosts for this loop 28011 1726882546.76524: getting the next task for host managed_node1 28011 1726882546.76530: done getting next task for host managed_node1 28011 1726882546.76533: ^ task is: TASK: Assert that the route table 30200 contains the specified route 28011 1726882546.76535: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882546.76539: getting variables 28011 1726882546.76541: in VariableManager get_vars() 28011 1726882546.76579: Calling all_inventory to load vars for managed_node1 28011 1726882546.76582: Calling groups_inventory to load vars for managed_node1 28011 1726882546.76584: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882546.76625: Calling all_plugins_play to load vars for managed_node1 28011 1726882546.76629: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882546.76633: Calling groups_plugins_play to load vars for managed_node1 28011 1726882546.77822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882546.79061: done with get_vars() 28011 1726882546.79075: done getting variables 28011 1726882546.79125: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the route table 30200 contains the specified route] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:68 Friday 20 September 2024 21:35:46 -0400 (0:00:00.450) 0:00:16.342 ****** 28011 1726882546.79146: entering _queue_task() for managed_node1/assert 28011 1726882546.79369: worker is 1 (out of 1 available) 28011 1726882546.79382: exiting _queue_task() for managed_node1/assert 28011 1726882546.79399: done queuing things up, now waiting for results queue to drain 28011 1726882546.79401: waiting for pending results... 
28011 1726882546.79713: running TaskExecutor() for managed_node1/TASK: Assert that the route table 30200 contains the specified route 28011 1726882546.79899: in run() - task 12673a56-9f93-962d-7c65-00000000005d 28011 1726882546.79903: variable 'ansible_search_path' from source: unknown 28011 1726882546.79906: calling self._execute() 28011 1726882546.79909: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882546.79912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882546.79916: variable 'omit' from source: magic vars 28011 1726882546.80259: variable 'ansible_distribution_major_version' from source: facts 28011 1726882546.80266: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882546.80272: variable 'omit' from source: magic vars 28011 1726882546.80303: variable 'omit' from source: magic vars 28011 1726882546.80339: variable 'omit' from source: magic vars 28011 1726882546.80376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882546.80412: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882546.80434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882546.80452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882546.80462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882546.80494: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882546.80498: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882546.80500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882546.80595: Set connection var 
ansible_connection to ssh 28011 1726882546.80601: Set connection var ansible_pipelining to False 28011 1726882546.80607: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882546.80612: Set connection var ansible_shell_executable to /bin/sh 28011 1726882546.80621: Set connection var ansible_timeout to 10 28011 1726882546.80626: Set connection var ansible_shell_type to sh 28011 1726882546.80650: variable 'ansible_shell_executable' from source: unknown 28011 1726882546.80653: variable 'ansible_connection' from source: unknown 28011 1726882546.80656: variable 'ansible_module_compression' from source: unknown 28011 1726882546.80659: variable 'ansible_shell_type' from source: unknown 28011 1726882546.80661: variable 'ansible_shell_executable' from source: unknown 28011 1726882546.80663: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882546.80668: variable 'ansible_pipelining' from source: unknown 28011 1726882546.80670: variable 'ansible_timeout' from source: unknown 28011 1726882546.80674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882546.80813: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882546.80820: variable 'omit' from source: magic vars 28011 1726882546.80825: starting attempt loop 28011 1726882546.80828: running the handler 28011 1726882546.80972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882546.81157: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882546.81186: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 
1726882546.81244: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882546.81271: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882546.81332: variable 'route_table_30200' from source: set_fact 28011 1726882546.81357: Evaluated conditional (route_table_30200.stdout is search("198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4")): True 28011 1726882546.81457: variable 'route_table_30200' from source: set_fact 28011 1726882546.81471: Evaluated conditional (route_table_30200.stdout is search("192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50")): True 28011 1726882546.81477: handler run complete 28011 1726882546.81488: attempt loop complete, returning result 28011 1726882546.81491: _execute() done 28011 1726882546.81498: dumping result to json 28011 1726882546.81500: done dumping result, returning 28011 1726882546.81507: done running TaskExecutor() for managed_node1/TASK: Assert that the route table 30200 contains the specified route [12673a56-9f93-962d-7c65-00000000005d] 28011 1726882546.81512: sending task result for task 12673a56-9f93-962d-7c65-00000000005d ok: [managed_node1] => { "changed": false } MSG: All assertions passed 28011 1726882546.81641: no more pending results, returning what we have 28011 1726882546.81644: results queue empty 28011 1726882546.81644: checking for any_errors_fatal 28011 1726882546.81655: done checking for any_errors_fatal 28011 1726882546.81656: checking for max_fail_percentage 28011 1726882546.81657: done checking for max_fail_percentage 28011 1726882546.81658: checking to see if all hosts have failed and the running result is not ok 28011 1726882546.81659: done checking to see if all hosts have failed 28011 1726882546.81660: getting the remaining hosts for this loop 28011 1726882546.81661: done getting the remaining hosts for this loop 28011 
1726882546.81666: getting the next task for host managed_node1 28011 1726882546.81671: done getting next task for host managed_node1 28011 1726882546.81674: ^ task is: TASK: Assert that the route table 30400 contains the specified route 28011 1726882546.81676: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882546.81680: getting variables 28011 1726882546.81681: in VariableManager get_vars() 28011 1726882546.81721: Calling all_inventory to load vars for managed_node1 28011 1726882546.81724: Calling groups_inventory to load vars for managed_node1 28011 1726882546.81726: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882546.81735: Calling all_plugins_play to load vars for managed_node1 28011 1726882546.81738: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882546.81740: Calling groups_plugins_play to load vars for managed_node1 28011 1726882546.82306: done sending task result for task 12673a56-9f93-962d-7c65-00000000005d 28011 1726882546.82310: WORKER PROCESS EXITING 28011 1726882546.82540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882546.87325: done with get_vars() 28011 1726882546.87345: done getting variables 28011 1726882546.87394: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the route table 30400 contains the specified route] ********** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:76 Friday 20 September 2024 21:35:46 -0400 (0:00:00.082) 0:00:16.425 ****** 28011 1726882546.87421: entering _queue_task() for managed_node1/assert 28011 1726882546.87762: worker is 1 (out of 1 available) 28011 1726882546.87778: exiting _queue_task() for managed_node1/assert 28011 1726882546.87796: done queuing things up, now waiting for results queue to drain 28011 1726882546.87797: waiting for pending results... 28011 1726882546.88006: running TaskExecutor() for managed_node1/TASK: Assert that the route table 30400 contains the specified route 28011 1726882546.88068: in run() - task 12673a56-9f93-962d-7c65-00000000005e 28011 1726882546.88082: variable 'ansible_search_path' from source: unknown 28011 1726882546.88115: calling self._execute() 28011 1726882546.88188: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882546.88200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882546.88210: variable 'omit' from source: magic vars 28011 1726882546.88585: variable 'ansible_distribution_major_version' from source: facts 28011 1726882546.88606: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882546.88611: variable 'omit' from source: magic vars 28011 1726882546.88644: variable 'omit' from source: magic vars 28011 1726882546.88683: variable 'omit' from source: magic vars 28011 1726882546.88713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882546.88749: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882546.88768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882546.88781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28011 1726882546.88789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882546.88819: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882546.88822: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882546.88825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882546.88891: Set connection var ansible_connection to ssh 28011 1726882546.88903: Set connection var ansible_pipelining to False 28011 1726882546.88906: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882546.88911: Set connection var ansible_shell_executable to /bin/sh 28011 1726882546.88919: Set connection var ansible_timeout to 10 28011 1726882546.88924: Set connection var ansible_shell_type to sh 28011 1726882546.88943: variable 'ansible_shell_executable' from source: unknown 28011 1726882546.88946: variable 'ansible_connection' from source: unknown 28011 1726882546.88948: variable 'ansible_module_compression' from source: unknown 28011 1726882546.88951: variable 'ansible_shell_type' from source: unknown 28011 1726882546.88953: variable 'ansible_shell_executable' from source: unknown 28011 1726882546.88956: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882546.88959: variable 'ansible_pipelining' from source: unknown 28011 1726882546.88962: variable 'ansible_timeout' from source: unknown 28011 1726882546.88965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882546.89080: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 
1726882546.89097: variable 'omit' from source: magic vars 28011 1726882546.89101: starting attempt loop 28011 1726882546.89104: running the handler 28011 1726882546.89227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882546.89399: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882546.89457: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882546.89483: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882546.89514: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882546.89574: variable 'route_table_30400' from source: set_fact 28011 1726882546.89597: Evaluated conditional (route_table_30400.stdout is search("198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2")): True 28011 1726882546.89602: handler run complete 28011 1726882546.89613: attempt loop complete, returning result 28011 1726882546.89616: _execute() done 28011 1726882546.89619: dumping result to json 28011 1726882546.89622: done dumping result, returning 28011 1726882546.89630: done running TaskExecutor() for managed_node1/TASK: Assert that the route table 30400 contains the specified route [12673a56-9f93-962d-7c65-00000000005e] 28011 1726882546.89632: sending task result for task 12673a56-9f93-962d-7c65-00000000005e 28011 1726882546.89713: done sending task result for task 12673a56-9f93-962d-7c65-00000000005e 28011 1726882546.89716: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 28011 1726882546.89759: no more pending results, returning what we have 28011 1726882546.89762: results queue empty 28011 1726882546.89763: checking for any_errors_fatal 28011 1726882546.89770: done checking for any_errors_fatal 28011 
1726882546.89770: checking for max_fail_percentage 28011 1726882546.89772: done checking for max_fail_percentage 28011 1726882546.89772: checking to see if all hosts have failed and the running result is not ok 28011 1726882546.89773: done checking to see if all hosts have failed 28011 1726882546.89774: getting the remaining hosts for this loop 28011 1726882546.89775: done getting the remaining hosts for this loop 28011 1726882546.89778: getting the next task for host managed_node1 28011 1726882546.89783: done getting next task for host managed_node1 28011 1726882546.89786: ^ task is: TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 28011 1726882546.89788: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882546.89791: getting variables 28011 1726882546.89794: in VariableManager get_vars() 28011 1726882546.89831: Calling all_inventory to load vars for managed_node1 28011 1726882546.89834: Calling groups_inventory to load vars for managed_node1 28011 1726882546.89836: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882546.89845: Calling all_plugins_play to load vars for managed_node1 28011 1726882546.89847: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882546.89850: Calling groups_plugins_play to load vars for managed_node1 28011 1726882546.90624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882546.91487: done with get_vars() 28011 1726882546.91504: done getting variables TASK [Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:82 Friday 20 September 2024 21:35:46 -0400 (0:00:00.041) 0:00:16.466 ****** 28011 1726882546.91565: entering _queue_task() for managed_node1/lineinfile 28011 1726882546.91566: Creating lock for lineinfile 28011 1726882546.91778: worker is 1 (out of 1 available) 28011 1726882546.91795: exiting _queue_task() for managed_node1/lineinfile 28011 1726882546.91807: done queuing things up, now waiting for results queue to drain 28011 1726882546.91809: waiting for pending results... 
28011 1726882546.91975: running TaskExecutor() for managed_node1/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 28011 1726882546.92036: in run() - task 12673a56-9f93-962d-7c65-00000000005f 28011 1726882546.92050: variable 'ansible_search_path' from source: unknown 28011 1726882546.92076: calling self._execute() 28011 1726882546.92156: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882546.92161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882546.92164: variable 'omit' from source: magic vars 28011 1726882546.92441: variable 'ansible_distribution_major_version' from source: facts 28011 1726882546.92451: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882546.92457: variable 'omit' from source: magic vars 28011 1726882546.92478: variable 'omit' from source: magic vars 28011 1726882546.92505: variable 'omit' from source: magic vars 28011 1726882546.92534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882546.92559: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882546.92577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882546.92594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882546.92603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882546.92625: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882546.92628: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882546.92631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882546.92698: Set 
connection var ansible_connection to ssh 28011 1726882546.92711: Set connection var ansible_pipelining to False 28011 1726882546.92714: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882546.92716: Set connection var ansible_shell_executable to /bin/sh 28011 1726882546.92722: Set connection var ansible_timeout to 10 28011 1726882546.92727: Set connection var ansible_shell_type to sh 28011 1726882546.92744: variable 'ansible_shell_executable' from source: unknown 28011 1726882546.92747: variable 'ansible_connection' from source: unknown 28011 1726882546.92749: variable 'ansible_module_compression' from source: unknown 28011 1726882546.92752: variable 'ansible_shell_type' from source: unknown 28011 1726882546.92755: variable 'ansible_shell_executable' from source: unknown 28011 1726882546.92757: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882546.92759: variable 'ansible_pipelining' from source: unknown 28011 1726882546.92763: variable 'ansible_timeout' from source: unknown 28011 1726882546.92767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882546.92908: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882546.92916: variable 'omit' from source: magic vars 28011 1726882546.92921: starting attempt loop 28011 1726882546.92924: running the handler 28011 1726882546.92937: _low_level_execute_command(): starting 28011 1726882546.92943: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882546.93461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882546.93465: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882546.93470: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882546.93472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882546.93534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882546.93538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882546.93540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882546.93578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882546.95191: stdout chunk (state=3): >>>/root <<< 28011 1726882546.95335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882546.95338: stdout chunk (state=3): >>><<< 28011 1726882546.95340: stderr chunk (state=3): >>><<< 28011 1726882546.95448: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882546.95455: _low_level_execute_command(): starting 28011 1726882546.95458: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403 `" && echo ansible-tmp-1726882546.953657-28797-218391770382403="` echo /root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403 `" ) && sleep 0' 28011 1726882546.95906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882546.95928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 28011 1726882546.95938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882546.95983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882546.95987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882546.96038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882546.97967: stdout chunk (state=3): >>>ansible-tmp-1726882546.953657-28797-218391770382403=/root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403 <<< 28011 1726882546.97978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882546.98148: stderr chunk (state=3): >>><<< 28011 1726882546.98151: stdout chunk (state=3): >>><<< 28011 1726882546.98399: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882546.953657-28797-218391770382403=/root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882546.98403: variable 'ansible_module_compression' from source: unknown 28011 1726882546.98406: ANSIBALLZ: Using lock for lineinfile 28011 1726882546.98409: ANSIBALLZ: Acquiring lock 28011 1726882546.98412: ANSIBALLZ: Lock acquired: 139767561973792 28011 1726882546.98414: ANSIBALLZ: Creating module 28011 1726882547.11460: ANSIBALLZ: Writing module into payload 28011 1726882547.11583: ANSIBALLZ: Writing module 28011 1726882547.11613: ANSIBALLZ: Renaming module 28011 1726882547.11618: ANSIBALLZ: Done creating module 28011 1726882547.11643: variable 'ansible_facts' from source: unknown 28011 1726882547.11695: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403/AnsiballZ_lineinfile.py 28011 1726882547.11797: Sending initial data 28011 1726882547.11801: Sent initial data (158 bytes) 28011 1726882547.12233: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882547.12237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882547.12240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28011 1726882547.12242: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882547.12244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882547.12286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882547.12296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882547.12341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882547.13910: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882547.13947: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882547.14009: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp5eto7l7x /root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403/AnsiballZ_lineinfile.py <<< 28011 1726882547.14011: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403/AnsiballZ_lineinfile.py" <<< 28011 1726882547.14039: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp5eto7l7x" to remote "/root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403/AnsiballZ_lineinfile.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403/AnsiballZ_lineinfile.py" <<< 28011 1726882547.14730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882547.14768: stderr chunk (state=3): >>><<< 28011 1726882547.14781: stdout chunk (state=3): >>><<< 28011 1726882547.14937: done transferring module to remote 28011 1726882547.14941: _low_level_execute_command(): starting 28011 1726882547.14943: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403/ /root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403/AnsiballZ_lineinfile.py && sleep 0' 28011 1726882547.15777: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882547.15784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882547.15792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882547.15940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882547.15954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882547.16323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882547.16330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882547.18361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882547.18365: stdout chunk (state=3): >>><<< 28011 1726882547.18367: stderr chunk (state=3): >>><<< 28011 1726882547.18369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882547.18372: _low_level_execute_command(): starting 28011 1726882547.18374: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403/AnsiballZ_lineinfile.py && sleep 0' 28011 1726882547.19623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882547.19658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882547.19675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882547.19687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882547.19765: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882547.35439: stdout chunk (state=3): >>> {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 28011 1726882547.36670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882547.36718: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882547.36730: stderr chunk (state=3): >>><<< 28011 1726882547.36736: stdout chunk (state=3): >>><<< 28011 1726882547.36799: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882547.36803: done with _execute_module (lineinfile, {'path': '/etc/iproute2/rt_tables.d/table.conf', 'line': '200 custom', 'mode': '0644', 'create': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'lineinfile', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882547.36806: _low_level_execute_command(): starting 28011 1726882547.36808: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882546.953657-28797-218391770382403/ > /dev/null 2>&1 && sleep 0' 28011 1726882547.37454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882547.37463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882547.37478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882547.37564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882547.39601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882547.39605: stdout chunk (state=3): >>><<< 28011 1726882547.39608: stderr chunk (state=3): >>><<< 28011 1726882547.39610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882547.39613: handler run complete 28011 1726882547.39615: attempt loop complete, returning result 28011 1726882547.39617: _execute() done 28011 1726882547.39619: dumping result to json 28011 1726882547.39621: done dumping result, returning 28011 1726882547.39623: done running TaskExecutor() for managed_node1/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table [12673a56-9f93-962d-7c65-00000000005f] 28011 1726882547.39626: sending task result for task 12673a56-9f93-962d-7c65-00000000005f 28011 1726882547.39702: done sending task result for task 12673a56-9f93-962d-7c65-00000000005f 28011 1726882547.39705: WORKER PROCESS EXITING changed: [managed_node1] => { "backup": "", "changed": true } MSG: line added 28011 1726882547.39768: no more pending results, returning what we have 28011 1726882547.39772: results queue empty 28011 1726882547.39772: checking for any_errors_fatal 28011 1726882547.39777: done checking for any_errors_fatal 28011 1726882547.39778: checking for max_fail_percentage 28011 1726882547.39780: done checking for max_fail_percentage 28011 1726882547.39780: checking to see if all hosts have failed and the running result is not ok 28011 1726882547.39781: done checking to see if all hosts have failed 28011 1726882547.39782: getting the remaining hosts for this loop 28011 1726882547.39783: done getting the remaining hosts for this loop 28011 1726882547.39786: getting the next task for host managed_node1 28011 1726882547.39798: done getting next task for host managed_node1 28011 1726882547.39803: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28011 1726882547.39805: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882547.39822: getting variables 28011 1726882547.39824: in VariableManager get_vars() 28011 1726882547.39862: Calling all_inventory to load vars for managed_node1 28011 1726882547.39865: Calling groups_inventory to load vars for managed_node1 28011 1726882547.39867: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882547.39876: Calling all_plugins_play to load vars for managed_node1 28011 1726882547.39879: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882547.39881: Calling groups_plugins_play to load vars for managed_node1 28011 1726882547.40900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882547.42280: done with get_vars() 28011 1726882547.42307: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:35:47 -0400 (0:00:00.508) 0:00:16.975 ****** 28011 1726882547.42426: entering _queue_task() for managed_node1/include_tasks 28011 1726882547.42851: worker is 1 (out of 1 available) 28011 1726882547.42866: exiting _queue_task() for managed_node1/include_tasks 28011 1726882547.42879: done queuing things up, now waiting for results queue to drain 28011 1726882547.42880: waiting for pending results... 
28011 1726882547.43354: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28011 1726882547.43360: in run() - task 12673a56-9f93-962d-7c65-000000000067 28011 1726882547.43363: variable 'ansible_search_path' from source: unknown 28011 1726882547.43366: variable 'ansible_search_path' from source: unknown 28011 1726882547.43397: calling self._execute() 28011 1726882547.43487: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882547.43495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882547.43506: variable 'omit' from source: magic vars 28011 1726882547.44017: variable 'ansible_distribution_major_version' from source: facts 28011 1726882547.44021: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882547.44024: _execute() done 28011 1726882547.44025: dumping result to json 28011 1726882547.44027: done dumping result, returning 28011 1726882547.44029: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-962d-7c65-000000000067] 28011 1726882547.44031: sending task result for task 12673a56-9f93-962d-7c65-000000000067 28011 1726882547.44149: no more pending results, returning what we have 28011 1726882547.44155: in VariableManager get_vars() 28011 1726882547.44201: Calling all_inventory to load vars for managed_node1 28011 1726882547.44204: Calling groups_inventory to load vars for managed_node1 28011 1726882547.44207: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882547.44219: Calling all_plugins_play to load vars for managed_node1 28011 1726882547.44222: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882547.44225: Calling groups_plugins_play to load vars for managed_node1 28011 1726882547.44788: done sending task result for task 12673a56-9f93-962d-7c65-000000000067 28011 
1726882547.44799: WORKER PROCESS EXITING 28011 1726882547.45988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882547.47529: done with get_vars() 28011 1726882547.47554: variable 'ansible_search_path' from source: unknown 28011 1726882547.47555: variable 'ansible_search_path' from source: unknown 28011 1726882547.47596: we have included files to process 28011 1726882547.47597: generating all_blocks data 28011 1726882547.47602: done generating all_blocks data 28011 1726882547.47606: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28011 1726882547.47607: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28011 1726882547.47610: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28011 1726882547.48161: done processing included file 28011 1726882547.48163: iterating over new_blocks loaded from include file 28011 1726882547.48165: in VariableManager get_vars() 28011 1726882547.48191: done with get_vars() 28011 1726882547.48195: filtering new block on tags 28011 1726882547.48215: done filtering new block on tags 28011 1726882547.48217: in VariableManager get_vars() 28011 1726882547.48243: done with get_vars() 28011 1726882547.48245: filtering new block on tags 28011 1726882547.48265: done filtering new block on tags 28011 1726882547.48268: in VariableManager get_vars() 28011 1726882547.48291: done with get_vars() 28011 1726882547.48292: filtering new block on tags 28011 1726882547.48312: done filtering new block on tags 28011 1726882547.48314: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 28011 1726882547.48319: extending task lists for all hosts 
with included blocks 28011 1726882547.49088: done extending task lists 28011 1726882547.49090: done processing included files 28011 1726882547.49091: results queue empty 28011 1726882547.49091: checking for any_errors_fatal 28011 1726882547.49099: done checking for any_errors_fatal 28011 1726882547.49099: checking for max_fail_percentage 28011 1726882547.49101: done checking for max_fail_percentage 28011 1726882547.49101: checking to see if all hosts have failed and the running result is not ok 28011 1726882547.49102: done checking to see if all hosts have failed 28011 1726882547.49103: getting the remaining hosts for this loop 28011 1726882547.49104: done getting the remaining hosts for this loop 28011 1726882547.49107: getting the next task for host managed_node1 28011 1726882547.49111: done getting next task for host managed_node1 28011 1726882547.49114: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28011 1726882547.49117: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882547.49128: getting variables 28011 1726882547.49130: in VariableManager get_vars() 28011 1726882547.49147: Calling all_inventory to load vars for managed_node1 28011 1726882547.49150: Calling groups_inventory to load vars for managed_node1 28011 1726882547.49152: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882547.49158: Calling all_plugins_play to load vars for managed_node1 28011 1726882547.49161: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882547.49164: Calling groups_plugins_play to load vars for managed_node1 28011 1726882547.51374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882547.53804: done with get_vars() 28011 1726882547.53830: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:35:47 -0400 (0:00:00.114) 0:00:17.090 ****** 28011 1726882547.53917: entering _queue_task() for managed_node1/setup 28011 1726882547.54262: worker is 1 (out of 1 available) 28011 1726882547.54273: exiting _queue_task() for managed_node1/setup 28011 1726882547.54285: done queuing things up, now waiting for results queue to drain 28011 1726882547.54286: waiting for pending results... 
28011 1726882547.54586: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28011 1726882547.54826: in run() - task 12673a56-9f93-962d-7c65-0000000005df 28011 1726882547.54830: variable 'ansible_search_path' from source: unknown 28011 1726882547.54833: variable 'ansible_search_path' from source: unknown 28011 1726882547.54837: calling self._execute() 28011 1726882547.54880: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882547.54884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882547.54897: variable 'omit' from source: magic vars 28011 1726882547.55602: variable 'ansible_distribution_major_version' from source: facts 28011 1726882547.55653: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882547.55898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882547.61320: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882547.61399: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882547.61475: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882547.61541: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882547.61563: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882547.61642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882547.61807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882547.61810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882547.61813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882547.61815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882547.61858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882547.61862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882547.61918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882547.61958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882547.61972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882547.62259: variable '__network_required_facts' from source: role 
'' defaults 28011 1726882547.62272: variable 'ansible_facts' from source: unknown 28011 1726882547.63099: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28011 1726882547.63107: when evaluation is False, skipping this task 28011 1726882547.63110: _execute() done 28011 1726882547.63113: dumping result to json 28011 1726882547.63115: done dumping result, returning 28011 1726882547.63117: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-962d-7c65-0000000005df] 28011 1726882547.63119: sending task result for task 12673a56-9f93-962d-7c65-0000000005df 28011 1726882547.63333: done sending task result for task 12673a56-9f93-962d-7c65-0000000005df 28011 1726882547.63336: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882547.63373: no more pending results, returning what we have 28011 1726882547.63376: results queue empty 28011 1726882547.63376: checking for any_errors_fatal 28011 1726882547.63378: done checking for any_errors_fatal 28011 1726882547.63378: checking for max_fail_percentage 28011 1726882547.63380: done checking for max_fail_percentage 28011 1726882547.63381: checking to see if all hosts have failed and the running result is not ok 28011 1726882547.63382: done checking to see if all hosts have failed 28011 1726882547.63383: getting the remaining hosts for this loop 28011 1726882547.63384: done getting the remaining hosts for this loop 28011 1726882547.63387: getting the next task for host managed_node1 28011 1726882547.63397: done getting next task for host managed_node1 28011 1726882547.63400: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28011 1726882547.63404: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882547.63422: getting variables 28011 1726882547.63423: in VariableManager get_vars() 28011 1726882547.63465: Calling all_inventory to load vars for managed_node1 28011 1726882547.63469: Calling groups_inventory to load vars for managed_node1 28011 1726882547.63471: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882547.63480: Calling all_plugins_play to load vars for managed_node1 28011 1726882547.63483: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882547.63486: Calling groups_plugins_play to load vars for managed_node1 28011 1726882547.64944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882547.67146: done with get_vars() 28011 1726882547.67179: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:35:47 -0400 (0:00:00.133) 0:00:17.224 ****** 28011 1726882547.67311: entering _queue_task() for managed_node1/stat 28011 1726882547.67759: worker is 1 (out of 1 
available) 28011 1726882547.67772: exiting _queue_task() for managed_node1/stat 28011 1726882547.67792: done queuing things up, now waiting for results queue to drain 28011 1726882547.67796: waiting for pending results... 28011 1726882547.68703: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 28011 1726882547.69443: in run() - task 12673a56-9f93-962d-7c65-0000000005e1 28011 1726882547.69595: variable 'ansible_search_path' from source: unknown 28011 1726882547.69602: variable 'ansible_search_path' from source: unknown 28011 1726882547.69736: calling self._execute() 28011 1726882547.69872: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882547.69876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882547.69879: variable 'omit' from source: magic vars 28011 1726882547.70336: variable 'ansible_distribution_major_version' from source: facts 28011 1726882547.70348: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882547.70547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882547.70873: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882547.70887: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882547.70945: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882547.71041: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882547.71180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882547.71214: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882547.71332: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882547.71420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882547.71662: variable '__network_is_ostree' from source: set_fact 28011 1726882547.71665: Evaluated conditional (not __network_is_ostree is defined): False 28011 1726882547.71668: when evaluation is False, skipping this task 28011 1726882547.71669: _execute() done 28011 1726882547.71671: dumping result to json 28011 1726882547.71672: done dumping result, returning 28011 1726882547.71674: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-962d-7c65-0000000005e1] 28011 1726882547.71676: sending task result for task 12673a56-9f93-962d-7c65-0000000005e1 28011 1726882547.71752: done sending task result for task 12673a56-9f93-962d-7c65-0000000005e1 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28011 1726882547.71812: no more pending results, returning what we have 28011 1726882547.71815: results queue empty 28011 1726882547.71816: checking for any_errors_fatal 28011 1726882547.71824: done checking for any_errors_fatal 28011 1726882547.71825: checking for max_fail_percentage 28011 1726882547.71827: done checking for max_fail_percentage 28011 1726882547.71828: checking to see if all hosts have failed and the running result is not ok 28011 1726882547.71829: done checking to see if all 
hosts have failed 28011 1726882547.71830: getting the remaining hosts for this loop 28011 1726882547.71831: done getting the remaining hosts for this loop 28011 1726882547.71835: getting the next task for host managed_node1 28011 1726882547.71843: done getting next task for host managed_node1 28011 1726882547.71852: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28011 1726882547.72028: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882547.72046: getting variables 28011 1726882547.72047: in VariableManager get_vars() 28011 1726882547.72092: Calling all_inventory to load vars for managed_node1 28011 1726882547.72170: Calling groups_inventory to load vars for managed_node1 28011 1726882547.72173: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882547.72322: Calling all_plugins_play to load vars for managed_node1 28011 1726882547.72326: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882547.72333: Calling groups_plugins_play to load vars for managed_node1 28011 1726882547.73554: WORKER PROCESS EXITING 28011 1726882547.74054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882547.75440: done with get_vars() 28011 1726882547.75463: done getting variables 28011 1726882547.75528: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:35:47 -0400 (0:00:00.082) 0:00:17.306 ****** 28011 1726882547.75564: entering _queue_task() for managed_node1/set_fact 28011 1726882547.76048: worker is 1 (out of 1 available) 28011 1726882547.76060: exiting _queue_task() for managed_node1/set_fact 28011 1726882547.76129: done queuing things up, now waiting for results queue to drain 28011 1726882547.76131: waiting for pending results... 
28011 1726882547.76670: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28011 1726882547.76972: in run() - task 12673a56-9f93-962d-7c65-0000000005e2 28011 1726882547.76994: variable 'ansible_search_path' from source: unknown 28011 1726882547.77003: variable 'ansible_search_path' from source: unknown 28011 1726882547.77080: calling self._execute() 28011 1726882547.77148: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882547.77161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882547.77178: variable 'omit' from source: magic vars 28011 1726882547.77552: variable 'ansible_distribution_major_version' from source: facts 28011 1726882547.77580: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882547.77694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882547.77905: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882547.77951: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882547.77975: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882547.78018: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882547.78133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882547.78159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882547.78165: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882547.78228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882547.78277: variable '__network_is_ostree' from source: set_fact 28011 1726882547.78299: Evaluated conditional (not __network_is_ostree is defined): False 28011 1726882547.78302: when evaluation is False, skipping this task 28011 1726882547.78305: _execute() done 28011 1726882547.78307: dumping result to json 28011 1726882547.78310: done dumping result, returning 28011 1726882547.78324: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-962d-7c65-0000000005e2] 28011 1726882547.78339: sending task result for task 12673a56-9f93-962d-7c65-0000000005e2 28011 1726882547.78510: done sending task result for task 12673a56-9f93-962d-7c65-0000000005e2 28011 1726882547.78513: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28011 1726882547.78569: no more pending results, returning what we have 28011 1726882547.78572: results queue empty 28011 1726882547.78573: checking for any_errors_fatal 28011 1726882547.78580: done checking for any_errors_fatal 28011 1726882547.78580: checking for max_fail_percentage 28011 1726882547.78582: done checking for max_fail_percentage 28011 1726882547.78583: checking to see if all hosts have failed and the running result is not ok 28011 1726882547.78584: done checking to see if all hosts have failed 28011 1726882547.78584: getting the remaining hosts for this loop 28011 1726882547.78585: done getting the remaining hosts for this loop 
28011 1726882547.78589: getting the next task for host managed_node1 28011 1726882547.78598: done getting next task for host managed_node1 28011 1726882547.78602: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28011 1726882547.78606: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882547.78690: getting variables 28011 1726882547.78692: in VariableManager get_vars() 28011 1726882547.78749: Calling all_inventory to load vars for managed_node1 28011 1726882547.78760: Calling groups_inventory to load vars for managed_node1 28011 1726882547.78764: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882547.78773: Calling all_plugins_play to load vars for managed_node1 28011 1726882547.78776: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882547.78847: Calling groups_plugins_play to load vars for managed_node1 28011 1726882547.80437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882547.83202: done with get_vars() 28011 1726882547.83228: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:35:47 -0400 (0:00:00.077) 0:00:17.384 ****** 28011 1726882547.83320: entering _queue_task() for managed_node1/service_facts 28011 1726882547.83661: worker is 1 (out of 1 available) 28011 1726882547.83675: exiting _queue_task() for managed_node1/service_facts 28011 1726882547.83689: done queuing things up, now waiting for results queue to drain 28011 1726882547.83691: waiting for pending results... 
28011 1726882547.84333: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 28011 1726882547.84338: in run() - task 12673a56-9f93-962d-7c65-0000000005e4 28011 1726882547.84341: variable 'ansible_search_path' from source: unknown 28011 1726882547.84343: variable 'ansible_search_path' from source: unknown 28011 1726882547.84405: calling self._execute() 28011 1726882547.84529: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882547.84533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882547.84545: variable 'omit' from source: magic vars 28011 1726882547.85044: variable 'ansible_distribution_major_version' from source: facts 28011 1726882547.85061: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882547.85064: variable 'omit' from source: magic vars 28011 1726882547.85147: variable 'omit' from source: magic vars 28011 1726882547.85184: variable 'omit' from source: magic vars 28011 1726882547.85225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882547.85264: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882547.85285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882547.85304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882547.85319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882547.85383: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882547.85386: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882547.85391: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 28011 1726882547.85562: Set connection var ansible_connection to ssh 28011 1726882547.85575: Set connection var ansible_pipelining to False 28011 1726882547.85581: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882547.85587: Set connection var ansible_shell_executable to /bin/sh 28011 1726882547.85596: Set connection var ansible_timeout to 10 28011 1726882547.85602: Set connection var ansible_shell_type to sh 28011 1726882547.85639: variable 'ansible_shell_executable' from source: unknown 28011 1726882547.85643: variable 'ansible_connection' from source: unknown 28011 1726882547.85646: variable 'ansible_module_compression' from source: unknown 28011 1726882547.85648: variable 'ansible_shell_type' from source: unknown 28011 1726882547.85650: variable 'ansible_shell_executable' from source: unknown 28011 1726882547.85653: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882547.85655: variable 'ansible_pipelining' from source: unknown 28011 1726882547.85669: variable 'ansible_timeout' from source: unknown 28011 1726882547.85672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882547.86006: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882547.86017: variable 'omit' from source: magic vars 28011 1726882547.86020: starting attempt loop 28011 1726882547.86055: running the handler 28011 1726882547.86058: _low_level_execute_command(): starting 28011 1726882547.86061: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882547.86927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882547.86933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 28011 1726882547.87051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882547.87055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882547.87058: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882547.87101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882547.87105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882547.87167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882547.87306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882547.88955: stdout chunk (state=3): >>>/root <<< 28011 1726882547.89198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882547.89201: stdout chunk (state=3): >>><<< 28011 1726882547.89203: stderr chunk (state=3): >>><<< 28011 1726882547.89206: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882547.89209: _low_level_execute_command(): starting 28011 1726882547.89211: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609 `" && echo ansible-tmp-1726882547.891394-28853-162608014476609="` echo /root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609 `" ) && sleep 0' 28011 1726882547.89803: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882547.89812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882547.89823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882547.89847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882547.89909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882547.89959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882547.89972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882547.89988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882547.90096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882547.92100: stdout chunk (state=3): >>>ansible-tmp-1726882547.891394-28853-162608014476609=/root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609 <<< 28011 1726882547.92110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882547.92223: stderr chunk (state=3): >>><<< 28011 1726882547.92226: stdout chunk (state=3): >>><<< 28011 1726882547.92275: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882547.891394-28853-162608014476609=/root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882547.92322: variable 'ansible_module_compression' from source: unknown 28011 1726882547.92379: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28011 1726882547.92426: variable 'ansible_facts' from source: unknown 28011 1726882547.92552: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609/AnsiballZ_service_facts.py 28011 1726882547.92960: Sending initial data 28011 1726882547.92963: Sent initial data (161 bytes) 28011 1726882547.93960: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882547.94216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882547.94351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882547.95861: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882547.95923: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882547.95983: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp5wy2_2r1 /root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609/AnsiballZ_service_facts.py <<< 28011 1726882547.95996: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609/AnsiballZ_service_facts.py" <<< 28011 1726882547.96044: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 28011 1726882547.96059: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp5wy2_2r1" to remote "/root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609/AnsiballZ_service_facts.py" <<< 28011 1726882547.96932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882547.97143: stderr chunk (state=3): >>><<< 28011 1726882547.97147: stdout chunk (state=3): >>><<< 28011 1726882547.97413: done transferring module to remote 28011 1726882547.97424: _low_level_execute_command(): starting 28011 1726882547.97428: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609/ /root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609/AnsiballZ_service_facts.py && sleep 0' 28011 1726882547.98654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882547.98659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882547.98673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882547.98684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882547.98751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882548.00481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882548.00486: stderr chunk (state=3): >>><<< 28011 1726882548.00491: stdout chunk (state=3): >>><<< 28011 1726882548.00624: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882548.00631: _low_level_execute_command(): starting 28011 1726882548.00637: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609/AnsiballZ_service_facts.py && sleep 0' 28011 1726882548.01510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882548.01529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882548.01535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882548.01552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882548.01558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882548.01631: stderr chunk (state=3): >>>debug2: match found <<< 28011 1726882548.01657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882548.01710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882549.52389: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 28011 1726882549.52401: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 28011 1726882549.52440: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": 
"sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 28011 1726882549.52490: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 28011 1726882549.52532: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28011 1726882549.53947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882549.54014: stderr chunk (state=3): >>><<< 28011 1726882549.54018: stdout chunk (state=3): >>><<< 28011 1726882549.54038: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, 
"gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": 
"modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": 
"sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": 
{"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": 
"selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": 
{"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882549.54649: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882549.54653: _low_level_execute_command(): starting 28011 1726882549.54656: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882547.891394-28853-162608014476609/ > /dev/null 2>&1 && sleep 0' 28011 1726882549.55256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882549.55261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882549.55296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882549.55299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882549.55330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882549.55363: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882549.55366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882549.55387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882549.55417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882549.55462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882549.57262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882549.57276: stderr chunk (state=3): >>><<< 28011 1726882549.57279: stdout chunk (state=3): >>><<< 28011 1726882549.57292: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882549.57303: handler run complete 28011 1726882549.57439: variable 'ansible_facts' from source: unknown 28011 1726882549.57532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882549.57901: variable 'ansible_facts' from source: unknown 28011 1726882549.58000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882549.58135: attempt loop complete, returning result 28011 1726882549.58138: _execute() done 28011 1726882549.58141: dumping result to json 28011 1726882549.58177: done dumping result, returning 28011 1726882549.58185: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-962d-7c65-0000000005e4] 28011 1726882549.58188: sending task result for task 12673a56-9f93-962d-7c65-0000000005e4 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882549.58855: no more pending results, returning what we have 28011 1726882549.58858: results queue empty 28011 1726882549.58858: checking for any_errors_fatal 28011 1726882549.58864: done checking for any_errors_fatal 28011 1726882549.58865: checking for max_fail_percentage 28011 1726882549.58866: done checking for max_fail_percentage 28011 1726882549.58867: checking to see if all hosts have failed and the running result is not ok 28011 1726882549.58868: done checking to see if all hosts have failed 28011 1726882549.58868: getting the remaining hosts for this loop 28011 1726882549.58869: done getting the remaining hosts for this loop 28011 1726882549.58873: getting the next task for 
host managed_node1 28011 1726882549.58877: done getting next task for host managed_node1 28011 1726882549.58880: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28011 1726882549.58883: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882549.58895: done sending task result for task 12673a56-9f93-962d-7c65-0000000005e4 28011 1726882549.58898: WORKER PROCESS EXITING 28011 1726882549.58904: getting variables 28011 1726882549.58905: in VariableManager get_vars() 28011 1726882549.58936: Calling all_inventory to load vars for managed_node1 28011 1726882549.58937: Calling groups_inventory to load vars for managed_node1 28011 1726882549.58939: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882549.58945: Calling all_plugins_play to load vars for managed_node1 28011 1726882549.58947: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882549.58948: Calling groups_plugins_play to load vars for managed_node1 28011 1726882549.60099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882549.61140: done with get_vars() 28011 1726882549.61156: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:35:49 -0400 (0:00:01.779) 0:00:19.163 ****** 28011 1726882549.61230: entering _queue_task() for managed_node1/package_facts 28011 1726882549.61467: worker is 1 (out of 1 available) 28011 1726882549.61480: exiting _queue_task() for managed_node1/package_facts 28011 1726882549.61497: done queuing things up, now waiting for results queue to drain 28011 1726882549.61499: waiting for pending results... 
28011 1726882549.61672: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 28011 1726882549.61773: in run() - task 12673a56-9f93-962d-7c65-0000000005e5 28011 1726882549.61785: variable 'ansible_search_path' from source: unknown 28011 1726882549.61791: variable 'ansible_search_path' from source: unknown 28011 1726882549.61818: calling self._execute() 28011 1726882549.61894: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882549.61899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882549.61907: variable 'omit' from source: magic vars 28011 1726882549.62203: variable 'ansible_distribution_major_version' from source: facts 28011 1726882549.62220: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882549.62223: variable 'omit' from source: magic vars 28011 1726882549.62280: variable 'omit' from source: magic vars 28011 1726882549.62321: variable 'omit' from source: magic vars 28011 1726882549.62351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882549.62384: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882549.62397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882549.62412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882549.62428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882549.62455: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882549.62458: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882549.62460: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 28011 1726882549.62533: Set connection var ansible_connection to ssh 28011 1726882549.62539: Set connection var ansible_pipelining to False 28011 1726882549.62558: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882549.62561: Set connection var ansible_shell_executable to /bin/sh 28011 1726882549.62583: Set connection var ansible_timeout to 10 28011 1726882549.62586: Set connection var ansible_shell_type to sh 28011 1726882549.62608: variable 'ansible_shell_executable' from source: unknown 28011 1726882549.62611: variable 'ansible_connection' from source: unknown 28011 1726882549.62614: variable 'ansible_module_compression' from source: unknown 28011 1726882549.62616: variable 'ansible_shell_type' from source: unknown 28011 1726882549.62619: variable 'ansible_shell_executable' from source: unknown 28011 1726882549.62622: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882549.62624: variable 'ansible_pipelining' from source: unknown 28011 1726882549.62625: variable 'ansible_timeout' from source: unknown 28011 1726882549.62627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882549.62777: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882549.62785: variable 'omit' from source: magic vars 28011 1726882549.62792: starting attempt loop 28011 1726882549.62796: running the handler 28011 1726882549.62817: _low_level_execute_command(): starting 28011 1726882549.62840: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882549.63460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28011 1726882549.63506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882549.63510: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882549.63558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882549.63609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882549.63661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882549.65215: stdout chunk (state=3): >>>/root <<< 28011 1726882549.65332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882549.65364: stderr chunk (state=3): >>><<< 28011 1726882549.65367: stdout chunk (state=3): >>><<< 28011 1726882549.65386: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882549.65404: _low_level_execute_command(): starting 28011 1726882549.65412: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161 `" && echo ansible-tmp-1726882549.6538787-28918-66457937837161="` echo /root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161 `" ) && sleep 0' 28011 1726882549.66018: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882549.66050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882549.66062: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882549.66102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882549.66106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882549.66140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882549.66219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882549.68229: stdout chunk (state=3): >>>ansible-tmp-1726882549.6538787-28918-66457937837161=/root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161 <<< 28011 1726882549.68234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882549.68330: stderr chunk (state=3): >>><<< 28011 1726882549.68334: stdout chunk (state=3): >>><<< 28011 1726882549.68356: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882549.6538787-28918-66457937837161=/root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882549.68424: variable 'ansible_module_compression' from source: unknown 28011 1726882549.68497: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28011 1726882549.68649: variable 'ansible_facts' from source: unknown 28011 1726882549.68797: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161/AnsiballZ_package_facts.py 28011 1726882549.69020: Sending initial data 28011 1726882549.69024: Sent initial data (161 bytes) 28011 1726882549.70024: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882549.70053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882549.70069: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882549.70101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882549.70390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882549.72105: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882549.72133: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882549.72188: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmptki7du_v /root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161/AnsiballZ_package_facts.py <<< 28011 1726882549.72191: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161/AnsiballZ_package_facts.py" <<< 28011 1726882549.72231: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmptki7du_v" to remote "/root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161/AnsiballZ_package_facts.py" <<< 28011 1726882549.74301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882549.74411: stderr chunk (state=3): >>><<< 28011 1726882549.74415: stdout chunk (state=3): >>><<< 28011 1726882549.74475: done transferring module to remote 28011 1726882549.74486: _low_level_execute_command(): starting 28011 1726882549.74492: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161/ /root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161/AnsiballZ_package_facts.py && sleep 0' 28011 1726882549.75209: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882549.75220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882549.75231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882549.75256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882549.75320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882549.77026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882549.77047: stderr chunk (state=3): >>><<< 28011 1726882549.77050: stdout chunk (state=3): >>><<< 28011 1726882549.77062: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882549.77064: _low_level_execute_command(): starting 28011 1726882549.77069: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161/AnsiballZ_package_facts.py && sleep 0' 28011 1726882549.77458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882549.77485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882549.77488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882549.77490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882549.77492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882549.77496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882549.77552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 
1726882549.77555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882549.77601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882550.21577: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": 
"default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": 
"3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "ar<<< 28011 1726882550.21604: stdout chunk (state=3): >>>ch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap":<<< 28011 1726882550.21713: stdout chunk (state=3): >>> [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": 
"6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", 
"release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", 
"release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", 
"version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": 
"NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18"<<< 28011 1726882550.21727: stdout chunk (state=3): >>>, "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "<<< 28011 1726882550.21730: stdout chunk (state=3): >>>epoch": null, "arch": "noarch", "source": 
"rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.1<<< 28011 1726882550.21749: stdout chunk (state=3): >>>9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", 
"version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": 
"device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28011 1726882550.23716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882550.23719: stdout chunk (state=3): >>><<< 28011 1726882550.23722: stderr chunk (state=3): >>><<< 28011 1726882550.23806: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882550.28824: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882550.28828: _low_level_execute_command(): starting 28011 1726882550.28830: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882549.6538787-28918-66457937837161/ > /dev/null 2>&1 && sleep 0' 28011 1726882550.29732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882550.29819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882550.30036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882550.30043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882550.30047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882550.30109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882550.32206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882550.32210: stdout chunk (state=3): >>><<< 28011 1726882550.32212: stderr chunk (state=3): >>><<< 28011 1726882550.32215: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882550.32217: handler run complete 28011 
1726882550.34104: variable 'ansible_facts' from source: unknown 28011 1726882550.34955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882550.38270: variable 'ansible_facts' from source: unknown 28011 1726882550.38784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882550.39746: attempt loop complete, returning result 28011 1726882550.39773: _execute() done 28011 1726882550.39784: dumping result to json 28011 1726882550.40043: done dumping result, returning 28011 1726882550.40100: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-962d-7c65-0000000005e5] 28011 1726882550.40105: sending task result for task 12673a56-9f93-962d-7c65-0000000005e5 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882550.42497: done sending task result for task 12673a56-9f93-962d-7c65-0000000005e5 28011 1726882550.42504: WORKER PROCESS EXITING 28011 1726882550.42516: no more pending results, returning what we have 28011 1726882550.42518: results queue empty 28011 1726882550.42519: checking for any_errors_fatal 28011 1726882550.42530: done checking for any_errors_fatal 28011 1726882550.42531: checking for max_fail_percentage 28011 1726882550.42533: done checking for max_fail_percentage 28011 1726882550.42534: checking to see if all hosts have failed and the running result is not ok 28011 1726882550.42535: done checking to see if all hosts have failed 28011 1726882550.42535: getting the remaining hosts for this loop 28011 1726882550.42537: done getting the remaining hosts for this loop 28011 1726882550.42540: getting the next task for host managed_node1 28011 1726882550.42546: done getting next task for host managed_node1 28011 1726882550.42550: ^ task is: 
TASK: fedora.linux_system_roles.network : Print network provider 28011 1726882550.42552: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882550.42563: getting variables 28011 1726882550.42564: in VariableManager get_vars() 28011 1726882550.42599: Calling all_inventory to load vars for managed_node1 28011 1726882550.42601: Calling groups_inventory to load vars for managed_node1 28011 1726882550.42603: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882550.42611: Calling all_plugins_play to load vars for managed_node1 28011 1726882550.42613: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882550.42616: Calling groups_plugins_play to load vars for managed_node1 28011 1726882550.43504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882550.44969: done with get_vars() 28011 1726882550.44986: done getting variables 28011 1726882550.45031: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 
Friday 20 September 2024 21:35:50 -0400 (0:00:00.838) 0:00:20.001 ****** 28011 1726882550.45055: entering _queue_task() for managed_node1/debug 28011 1726882550.45312: worker is 1 (out of 1 available) 28011 1726882550.45326: exiting _queue_task() for managed_node1/debug 28011 1726882550.45338: done queuing things up, now waiting for results queue to drain 28011 1726882550.45340: waiting for pending results... 28011 1726882550.45525: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 28011 1726882550.45614: in run() - task 12673a56-9f93-962d-7c65-000000000068 28011 1726882550.45626: variable 'ansible_search_path' from source: unknown 28011 1726882550.45629: variable 'ansible_search_path' from source: unknown 28011 1726882550.45660: calling self._execute() 28011 1726882550.45734: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882550.45738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882550.46098: variable 'omit' from source: magic vars 28011 1726882550.46122: variable 'ansible_distribution_major_version' from source: facts 28011 1726882550.46139: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882550.46150: variable 'omit' from source: magic vars 28011 1726882550.46212: variable 'omit' from source: magic vars 28011 1726882550.46314: variable 'network_provider' from source: set_fact 28011 1726882550.46337: variable 'omit' from source: magic vars 28011 1726882550.46384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882550.46432: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882550.46461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882550.46484: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882550.46507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882550.46545: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882550.46554: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882550.46563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882550.46665: Set connection var ansible_connection to ssh 28011 1726882550.46680: Set connection var ansible_pipelining to False 28011 1726882550.46692: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882550.46705: Set connection var ansible_shell_executable to /bin/sh 28011 1726882550.46897: Set connection var ansible_timeout to 10 28011 1726882550.46901: Set connection var ansible_shell_type to sh 28011 1726882550.46903: variable 'ansible_shell_executable' from source: unknown 28011 1726882550.46905: variable 'ansible_connection' from source: unknown 28011 1726882550.46907: variable 'ansible_module_compression' from source: unknown 28011 1726882550.46909: variable 'ansible_shell_type' from source: unknown 28011 1726882550.46911: variable 'ansible_shell_executable' from source: unknown 28011 1726882550.46913: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882550.46915: variable 'ansible_pipelining' from source: unknown 28011 1726882550.46917: variable 'ansible_timeout' from source: unknown 28011 1726882550.46919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882550.46921: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882550.46937: variable 'omit' from source: magic vars 28011 1726882550.46945: starting attempt loop 28011 1726882550.46952: running the handler 28011 1726882550.47001: handler run complete 28011 1726882550.47020: attempt loop complete, returning result 28011 1726882550.47027: _execute() done 28011 1726882550.47034: dumping result to json 28011 1726882550.47041: done dumping result, returning 28011 1726882550.47058: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-962d-7c65-000000000068] 28011 1726882550.47082: sending task result for task 12673a56-9f93-962d-7c65-000000000068 ok: [managed_node1] => {} MSG: Using network provider: nm 28011 1726882550.47296: no more pending results, returning what we have 28011 1726882550.47300: results queue empty 28011 1726882550.47301: checking for any_errors_fatal 28011 1726882550.47333: done checking for any_errors_fatal 28011 1726882550.47335: checking for max_fail_percentage 28011 1726882550.47336: done checking for max_fail_percentage 28011 1726882550.47337: checking to see if all hosts have failed and the running result is not ok 28011 1726882550.47338: done checking to see if all hosts have failed 28011 1726882550.47339: getting the remaining hosts for this loop 28011 1726882550.47341: done getting the remaining hosts for this loop 28011 1726882550.47344: getting the next task for host managed_node1 28011 1726882550.47382: done getting next task for host managed_node1 28011 1726882550.47386: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28011 1726882550.47389: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882550.47479: getting variables 28011 1726882550.47482: in VariableManager get_vars() 28011 1726882550.47564: Calling all_inventory to load vars for managed_node1 28011 1726882550.47566: Calling groups_inventory to load vars for managed_node1 28011 1726882550.47568: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882550.47579: done sending task result for task 12673a56-9f93-962d-7c65-000000000068 28011 1726882550.47582: WORKER PROCESS EXITING 28011 1726882550.47595: Calling all_plugins_play to load vars for managed_node1 28011 1726882550.47599: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882550.47625: Calling groups_plugins_play to load vars for managed_node1 28011 1726882550.49223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882550.50881: done with get_vars() 28011 1726882550.50922: done getting variables 28011 1726882550.50998: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:35:50 -0400 (0:00:00.059) 0:00:20.061 ****** 28011 1726882550.51042: entering _queue_task() for managed_node1/fail 28011 1726882550.51405: worker is 1 (out of 1 available) 28011 1726882550.51418: exiting _queue_task() for managed_node1/fail 28011 1726882550.51430: done queuing things up, now waiting for results queue to drain 28011 1726882550.51431: waiting for pending results... 28011 1726882550.51721: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28011 1726882550.51856: in run() - task 12673a56-9f93-962d-7c65-000000000069 28011 1726882550.51875: variable 'ansible_search_path' from source: unknown 28011 1726882550.51881: variable 'ansible_search_path' from source: unknown 28011 1726882550.51925: calling self._execute() 28011 1726882550.52031: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882550.52041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882550.52058: variable 'omit' from source: magic vars 28011 1726882550.52461: variable 'ansible_distribution_major_version' from source: facts 28011 1726882550.52484: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882550.52622: variable 'network_state' from source: role '' defaults 28011 1726882550.52636: Evaluated conditional (network_state != {}): False 28011 1726882550.52642: when evaluation is False, skipping this task 28011 1726882550.52648: _execute() done 28011 1726882550.52653: dumping result to json 28011 1726882550.52659: done dumping result, returning 28011 1726882550.52668: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [12673a56-9f93-962d-7c65-000000000069] 28011 1726882550.52676: sending task result for task 12673a56-9f93-962d-7c65-000000000069 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882550.52855: no more pending results, returning what we have 28011 1726882550.52858: results queue empty 28011 1726882550.52860: checking for any_errors_fatal 28011 1726882550.52873: done checking for any_errors_fatal 28011 1726882550.52874: checking for max_fail_percentage 28011 1726882550.52876: done checking for max_fail_percentage 28011 1726882550.52876: checking to see if all hosts have failed and the running result is not ok 28011 1726882550.52877: done checking to see if all hosts have failed 28011 1726882550.52878: getting the remaining hosts for this loop 28011 1726882550.52879: done getting the remaining hosts for this loop 28011 1726882550.52883: getting the next task for host managed_node1 28011 1726882550.52889: done getting next task for host managed_node1 28011 1726882550.52895: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28011 1726882550.52898: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882550.52918: getting variables 28011 1726882550.52920: in VariableManager get_vars() 28011 1726882550.52957: Calling all_inventory to load vars for managed_node1 28011 1726882550.52960: Calling groups_inventory to load vars for managed_node1 28011 1726882550.52962: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882550.52972: Calling all_plugins_play to load vars for managed_node1 28011 1726882550.52974: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882550.52977: Calling groups_plugins_play to load vars for managed_node1 28011 1726882550.53608: done sending task result for task 12673a56-9f93-962d-7c65-000000000069 28011 1726882550.53611: WORKER PROCESS EXITING 28011 1726882550.54604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882550.56158: done with get_vars() 28011 1726882550.56189: done getting variables 28011 1726882550.56266: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:35:50 -0400 (0:00:00.052) 0:00:20.114 ****** 28011 1726882550.56309: entering _queue_task() for managed_node1/fail 28011 1726882550.56720: worker is 1 (out of 1 available) 28011 1726882550.56735: exiting _queue_task() for managed_node1/fail 28011 1726882550.56749: done queuing things up, now waiting for results queue to drain 28011 1726882550.56751: waiting for pending results... 
28011 1726882550.57071: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28011 1726882550.57267: in run() - task 12673a56-9f93-962d-7c65-00000000006a 28011 1726882550.57296: variable 'ansible_search_path' from source: unknown 28011 1726882550.57307: variable 'ansible_search_path' from source: unknown 28011 1726882550.57352: calling self._execute() 28011 1726882550.57463: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882550.57482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882550.57509: variable 'omit' from source: magic vars 28011 1726882550.57945: variable 'ansible_distribution_major_version' from source: facts 28011 1726882550.57962: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882550.58102: variable 'network_state' from source: role '' defaults 28011 1726882550.58117: Evaluated conditional (network_state != {}): False 28011 1726882550.58124: when evaluation is False, skipping this task 28011 1726882550.58130: _execute() done 28011 1726882550.58137: dumping result to json 28011 1726882550.58146: done dumping result, returning 28011 1726882550.58159: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-962d-7c65-00000000006a] 28011 1726882550.58169: sending task result for task 12673a56-9f93-962d-7c65-00000000006a skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882550.58440: no more pending results, returning what we have 28011 1726882550.58444: results queue empty 28011 1726882550.58445: checking for any_errors_fatal 28011 1726882550.58452: done checking for any_errors_fatal 
28011 1726882550.58453: checking for max_fail_percentage 28011 1726882550.58455: done checking for max_fail_percentage 28011 1726882550.58456: checking to see if all hosts have failed and the running result is not ok 28011 1726882550.58457: done checking to see if all hosts have failed 28011 1726882550.58457: getting the remaining hosts for this loop 28011 1726882550.58459: done getting the remaining hosts for this loop 28011 1726882550.58463: getting the next task for host managed_node1 28011 1726882550.58469: done getting next task for host managed_node1 28011 1726882550.58473: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28011 1726882550.58476: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882550.58504: getting variables 28011 1726882550.58507: in VariableManager get_vars() 28011 1726882550.58548: Calling all_inventory to load vars for managed_node1 28011 1726882550.58551: Calling groups_inventory to load vars for managed_node1 28011 1726882550.58554: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882550.58566: Calling all_plugins_play to load vars for managed_node1 28011 1726882550.58569: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882550.58572: Calling groups_plugins_play to load vars for managed_node1 28011 1726882550.59114: done sending task result for task 12673a56-9f93-962d-7c65-00000000006a 28011 1726882550.59120: WORKER PROCESS EXITING 28011 1726882550.60138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882550.61712: done with get_vars() 28011 1726882550.61737: done getting variables 28011 1726882550.61810: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:35:50 -0400 (0:00:00.055) 0:00:20.169 ****** 28011 1726882550.61849: entering _queue_task() for managed_node1/fail 28011 1726882550.62200: worker is 1 (out of 1 available) 28011 1726882550.62212: exiting _queue_task() for managed_node1/fail 28011 1726882550.62224: done queuing things up, now waiting for results queue to drain 28011 1726882550.62230: waiting for pending results... 
28011 1726882550.62557: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28011 1726882550.62730: in run() - task 12673a56-9f93-962d-7c65-00000000006b 28011 1726882550.62756: variable 'ansible_search_path' from source: unknown 28011 1726882550.62768: variable 'ansible_search_path' from source: unknown 28011 1726882550.62817: calling self._execute() 28011 1726882550.62926: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882550.62953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882550.62973: variable 'omit' from source: magic vars 28011 1726882550.63398: variable 'ansible_distribution_major_version' from source: facts 28011 1726882550.63417: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882550.63617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882550.66464: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882550.66540: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882550.66586: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882550.66634: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882550.66672: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882550.66747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882550.66768: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882550.66785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.66827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882550.66833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882550.66909: variable 'ansible_distribution_major_version' from source: facts 28011 1726882550.66923: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28011 1726882550.67011: variable 'ansible_distribution' from source: facts 28011 1726882550.67015: variable '__network_rh_distros' from source: role '' defaults 28011 1726882550.67020: Evaluated conditional (ansible_distribution in __network_rh_distros): True 28011 1726882550.67263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882550.67298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882550.67332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 
1726882550.67360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882550.67370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882550.67434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882550.67455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882550.67501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.67532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882550.67535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882550.67585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882550.67608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 28011 1726882550.67628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.67663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882550.67674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882550.67946: variable 'network_connections' from source: task vars 28011 1726882550.67951: variable 'interface' from source: set_fact 28011 1726882550.68013: variable 'interface' from source: set_fact 28011 1726882550.68022: variable 'interface' from source: set_fact 28011 1726882550.68095: variable 'interface' from source: set_fact 28011 1726882550.68099: variable 'network_state' from source: role '' defaults 28011 1726882550.68144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882550.68314: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882550.68415: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882550.68423: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882550.68426: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882550.68495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882550.68506: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882550.68677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.68680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882550.68682: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 28011 1726882550.68684: when evaluation is False, skipping this task 28011 1726882550.68685: _execute() done 28011 1726882550.68687: dumping result to json 28011 1726882550.68689: done dumping result, returning 28011 1726882550.68691: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-962d-7c65-00000000006b] 28011 1726882550.68694: sending task result for task 12673a56-9f93-962d-7c65-00000000006b 28011 1726882550.68754: done sending task result for task 12673a56-9f93-962d-7c65-00000000006b 28011 1726882550.68757: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 28011 
1726882550.68834: no more pending results, returning what we have 28011 1726882550.68837: results queue empty 28011 1726882550.68838: checking for any_errors_fatal 28011 1726882550.68847: done checking for any_errors_fatal 28011 1726882550.68848: checking for max_fail_percentage 28011 1726882550.68851: done checking for max_fail_percentage 28011 1726882550.68851: checking to see if all hosts have failed and the running result is not ok 28011 1726882550.68852: done checking to see if all hosts have failed 28011 1726882550.68853: getting the remaining hosts for this loop 28011 1726882550.68855: done getting the remaining hosts for this loop 28011 1726882550.68858: getting the next task for host managed_node1 28011 1726882550.68866: done getting next task for host managed_node1 28011 1726882550.68870: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28011 1726882550.68873: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882550.68901: getting variables 28011 1726882550.68903: in VariableManager get_vars() 28011 1726882550.68952: Calling all_inventory to load vars for managed_node1 28011 1726882550.68956: Calling groups_inventory to load vars for managed_node1 28011 1726882550.68958: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882550.68970: Calling all_plugins_play to load vars for managed_node1 28011 1726882550.68972: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882550.68975: Calling groups_plugins_play to load vars for managed_node1 28011 1726882550.70607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882550.71555: done with get_vars() 28011 1726882550.71571: done getting variables 28011 1726882550.71615: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:35:50 -0400 (0:00:00.097) 0:00:20.267 ****** 28011 1726882550.71637: entering _queue_task() for managed_node1/dnf 28011 1726882550.71907: worker is 1 (out of 1 available) 28011 1726882550.71919: exiting _queue_task() for managed_node1/dnf 28011 1726882550.71931: done queuing things up, now waiting for results queue to drain 28011 1726882550.71932: waiting for pending results... 
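(Editor's aside on the trace above: the "Abort applying teaming configuration" task was skipped because its `when` expression, quoted verbatim in the skip result, found no connection whose `type` matches `^team$`. The chain `selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0` can be approximated in plain Python as the sketch below. This is illustrative only and is not the role's actual code; the connection dicts are hypothetical stand-ins for the `network_connections` variable seen in the log.)

```python
import re

def has_team_connection(connections):
    """Rough Python equivalent of the Jinja2 conditional from the trace:
    connections | selectattr("type", "defined")
                | selectattr("type", "match", "^team$")
                | list | length > 0
    """
    # selectattr("type", "defined"): keep entries that have a "type" key
    defined = [c for c in connections if "type" in c]
    # selectattr("type", "match", "^team$"): keep entries whose type matches ^team$
    teams = [c for c in defined if re.match(r"^team$", c["type"])]
    # | list | length > 0
    return len(teams) > 0

# An ethernet-only connection list (as in this run) evaluates to False,
# so the fail task is skipped rather than aborting the play.
network_connections = [{"name": "eth0", "type": "ethernet"}]
print(has_team_connection(network_connections))  # False
```

The same evaluation is applied to `network_state.get("interfaces", [])` in the second half of the logged expression; both halves were False here, hence `skip_reason: "Conditional result was False"`.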
28011 1726882550.72435: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28011 1726882550.72444: in run() - task 12673a56-9f93-962d-7c65-00000000006c 28011 1726882550.72447: variable 'ansible_search_path' from source: unknown 28011 1726882550.72450: variable 'ansible_search_path' from source: unknown 28011 1726882550.72453: calling self._execute() 28011 1726882550.72582: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882550.72621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882550.72627: variable 'omit' from source: magic vars 28011 1726882550.73278: variable 'ansible_distribution_major_version' from source: facts 28011 1726882550.73281: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882550.73539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882550.78439: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882550.78488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882550.78585: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882550.78590: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882550.78595: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882550.78657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882550.78687: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882550.78710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.78752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882550.78763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882550.78876: variable 'ansible_distribution' from source: facts 28011 1726882550.78880: variable 'ansible_distribution_major_version' from source: facts 28011 1726882550.78999: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28011 1726882550.79021: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882550.79144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882550.79167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882550.79195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.79231: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882550.79246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882550.79287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882550.79312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882550.79334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.79370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882550.79382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882550.79420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882550.79444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 
1726882550.79464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.79501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882550.79514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882550.79668: variable 'network_connections' from source: task vars 28011 1726882550.79699: variable 'interface' from source: set_fact 28011 1726882550.79739: variable 'interface' from source: set_fact 28011 1726882550.79750: variable 'interface' from source: set_fact 28011 1726882550.79885: variable 'interface' from source: set_fact 28011 1726882550.79888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882550.80083: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882550.80114: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882550.80137: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882550.80158: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882550.80196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882550.80211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882550.80235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.80252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882550.80298: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882550.80446: variable 'network_connections' from source: task vars 28011 1726882550.80450: variable 'interface' from source: set_fact 28011 1726882550.80496: variable 'interface' from source: set_fact 28011 1726882550.80500: variable 'interface' from source: set_fact 28011 1726882550.80541: variable 'interface' from source: set_fact 28011 1726882550.80569: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28011 1726882550.80573: when evaluation is False, skipping this task 28011 1726882550.80575: _execute() done 28011 1726882550.80577: dumping result to json 28011 1726882550.80579: done dumping result, returning 28011 1726882550.80586: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-962d-7c65-00000000006c] 28011 1726882550.80594: sending task result for task 12673a56-9f93-962d-7c65-00000000006c 28011 1726882550.80678: done sending task result for task 12673a56-9f93-962d-7c65-00000000006c 28011 1726882550.80680: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 28011 1726882550.80733: no more pending results, returning what we have 28011 1726882550.80737: results queue empty 28011 1726882550.80737: checking for any_errors_fatal 28011 1726882550.80743: done checking for any_errors_fatal 28011 1726882550.80744: checking for max_fail_percentage 28011 1726882550.80745: done checking for max_fail_percentage 28011 1726882550.80746: checking to see if all hosts have failed and the running result is not ok 28011 1726882550.80747: done checking to see if all hosts have failed 28011 1726882550.80747: getting the remaining hosts for this loop 28011 1726882550.80749: done getting the remaining hosts for this loop 28011 1726882550.80752: getting the next task for host managed_node1 28011 1726882550.80758: done getting next task for host managed_node1 28011 1726882550.80761: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28011 1726882550.80764: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882550.80781: getting variables 28011 1726882550.80783: in VariableManager get_vars() 28011 1726882550.80832: Calling all_inventory to load vars for managed_node1 28011 1726882550.80835: Calling groups_inventory to load vars for managed_node1 28011 1726882550.80837: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882550.80846: Calling all_plugins_play to load vars for managed_node1 28011 1726882550.80848: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882550.80850: Calling groups_plugins_play to load vars for managed_node1 28011 1726882550.81669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882550.83158: done with get_vars() 28011 1726882550.83174: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28011 1726882550.83231: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:35:50 -0400 (0:00:00.116) 0:00:20.383 ****** 28011 1726882550.83253: entering _queue_task() for managed_node1/yum 28011 1726882550.83476: worker is 1 (out of 1 available) 28011 1726882550.83488: exiting _queue_task() for managed_node1/yum 28011 1726882550.83505: done queuing things up, now waiting for results queue to drain 28011 1726882550.83507: waiting for pending results... 
28011 1726882550.83679: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28011 1726882550.83758: in run() - task 12673a56-9f93-962d-7c65-00000000006d 28011 1726882550.83769: variable 'ansible_search_path' from source: unknown 28011 1726882550.83773: variable 'ansible_search_path' from source: unknown 28011 1726882550.83802: calling self._execute() 28011 1726882550.83873: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882550.83878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882550.83887: variable 'omit' from source: magic vars 28011 1726882550.84152: variable 'ansible_distribution_major_version' from source: facts 28011 1726882550.84163: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882550.84283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882550.85730: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882550.85773: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882550.85802: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882550.85828: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882550.85848: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882550.85905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882550.85927: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882550.85944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.85969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882550.85980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882550.86049: variable 'ansible_distribution_major_version' from source: facts 28011 1726882550.86061: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28011 1726882550.86063: when evaluation is False, skipping this task 28011 1726882550.86066: _execute() done 28011 1726882550.86068: dumping result to json 28011 1726882550.86070: done dumping result, returning 28011 1726882550.86078: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-962d-7c65-00000000006d] 28011 1726882550.86081: sending task result for task 12673a56-9f93-962d-7c65-00000000006d 28011 1726882550.86162: done sending task result for task 12673a56-9f93-962d-7c65-00000000006d 28011 1726882550.86165: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28011 1726882550.86213: no more pending results, returning 
what we have 28011 1726882550.86216: results queue empty 28011 1726882550.86217: checking for any_errors_fatal 28011 1726882550.86225: done checking for any_errors_fatal 28011 1726882550.86225: checking for max_fail_percentage 28011 1726882550.86227: done checking for max_fail_percentage 28011 1726882550.86228: checking to see if all hosts have failed and the running result is not ok 28011 1726882550.86229: done checking to see if all hosts have failed 28011 1726882550.86229: getting the remaining hosts for this loop 28011 1726882550.86231: done getting the remaining hosts for this loop 28011 1726882550.86234: getting the next task for host managed_node1 28011 1726882550.86240: done getting next task for host managed_node1 28011 1726882550.86243: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28011 1726882550.86246: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882550.86263: getting variables 28011 1726882550.86264: in VariableManager get_vars() 28011 1726882550.86301: Calling all_inventory to load vars for managed_node1 28011 1726882550.86304: Calling groups_inventory to load vars for managed_node1 28011 1726882550.86306: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882550.86314: Calling all_plugins_play to load vars for managed_node1 28011 1726882550.86317: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882550.86319: Calling groups_plugins_play to load vars for managed_node1 28011 1726882550.87070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882550.87981: done with get_vars() 28011 1726882550.88004: done getting variables 28011 1726882550.88059: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:35:50 -0400 (0:00:00.048) 0:00:20.432 ****** 28011 1726882550.88097: entering _queue_task() for managed_node1/fail 28011 1726882550.88376: worker is 1 (out of 1 available) 28011 1726882550.88388: exiting _queue_task() for managed_node1/fail 28011 1726882550.88513: done queuing things up, now waiting for results queue to drain 28011 1726882550.88515: waiting for pending results... 
28011 1726882550.88811: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28011 1726882550.88852: in run() - task 12673a56-9f93-962d-7c65-00000000006e 28011 1726882550.88863: variable 'ansible_search_path' from source: unknown 28011 1726882550.88866: variable 'ansible_search_path' from source: unknown 28011 1726882550.88898: calling self._execute() 28011 1726882550.88966: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882550.88970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882550.88974: variable 'omit' from source: magic vars 28011 1726882550.89247: variable 'ansible_distribution_major_version' from source: facts 28011 1726882550.89256: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882550.89364: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882550.89481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882550.91071: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882550.91115: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882550.91147: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882550.91171: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882550.91195: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882550.91254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28011 1726882550.91272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882550.91398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.91401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882550.91403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882550.91405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882550.91424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882550.91448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.91485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882550.91506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882550.91544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882550.91567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882550.91595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.91633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882550.91649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882550.91834: variable 'network_connections' from source: task vars 28011 1726882550.91850: variable 'interface' from source: set_fact 28011 1726882550.91921: variable 'interface' from source: set_fact 28011 1726882550.91930: variable 'interface' from source: set_fact 28011 1726882550.91973: variable 'interface' from source: set_fact 28011 1726882550.92031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882550.92149: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882550.92175: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882550.92201: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882550.92223: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882550.92255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882550.92269: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882550.92286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882550.92308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882550.92355: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882550.92504: variable 'network_connections' from source: task vars 28011 1726882550.92508: variable 'interface' from source: set_fact 28011 1726882550.92550: variable 'interface' from source: set_fact 28011 1726882550.92557: variable 'interface' from source: set_fact 28011 1726882550.92603: variable 'interface' from source: set_fact 28011 1726882550.92629: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28011 1726882550.92632: when evaluation is False, skipping this task 28011 1726882550.92635: _execute() done 28011 1726882550.92637: dumping result to json 28011 1726882550.92640: done dumping result, returning 28011 1726882550.92646: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-962d-7c65-00000000006e] 28011 1726882550.92657: sending task result for task 12673a56-9f93-962d-7c65-00000000006e 28011 1726882550.92735: done sending task result for task 12673a56-9f93-962d-7c65-00000000006e 28011 1726882550.92738: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28011 1726882550.92827: no more pending results, returning what we have 28011 1726882550.92830: results queue empty 28011 1726882550.92830: checking for any_errors_fatal 28011 1726882550.92836: done checking for any_errors_fatal 28011 1726882550.92837: checking for max_fail_percentage 28011 1726882550.92839: done checking for max_fail_percentage 28011 1726882550.92839: checking to see if all hosts have failed and the running result is not ok 28011 1726882550.92840: done checking to see if all hosts have failed 28011 1726882550.92841: getting the remaining hosts for this loop 28011 1726882550.92842: done getting the remaining hosts for this loop 28011 1726882550.92845: getting the next task for host managed_node1 28011 1726882550.92851: done getting next task for host managed_node1 28011 1726882550.92854: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28011 1726882550.92857: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28011 1726882550.92872: getting variables 28011 1726882550.92874: in VariableManager get_vars() 28011 1726882550.92911: Calling all_inventory to load vars for managed_node1 28011 1726882550.92914: Calling groups_inventory to load vars for managed_node1 28011 1726882550.92916: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882550.92924: Calling all_plugins_play to load vars for managed_node1 28011 1726882550.92926: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882550.92929: Calling groups_plugins_play to load vars for managed_node1 28011 1726882550.93818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882550.99533: done with get_vars() 28011 1726882550.99557: done getting variables 28011 1726882550.99609: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:35:50 -0400 (0:00:00.115) 0:00:20.547 ****** 28011 1726882550.99638: entering _queue_task() for managed_node1/package 28011 1726882550.99978: worker is 1 (out of 1 available) 28011 1726882550.99991: exiting _queue_task() for managed_node1/package 28011 1726882551.00205: done queuing things up, now waiting for results queue to drain 28011 1726882551.00207: waiting for pending results... 
28011 1726882551.00336: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 28011 1726882551.00429: in run() - task 12673a56-9f93-962d-7c65-00000000006f 28011 1726882551.00450: variable 'ansible_search_path' from source: unknown 28011 1726882551.00457: variable 'ansible_search_path' from source: unknown 28011 1726882551.00500: calling self._execute() 28011 1726882551.00602: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882551.00613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882551.00653: variable 'omit' from source: magic vars 28011 1726882551.01006: variable 'ansible_distribution_major_version' from source: facts 28011 1726882551.01023: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882551.01207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882551.01525: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882551.01529: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882551.01599: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882551.01638: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882551.01749: variable 'network_packages' from source: role '' defaults 28011 1726882551.01853: variable '__network_provider_setup' from source: role '' defaults 28011 1726882551.01866: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882551.01937: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882551.01954: variable '__network_packages_default_nm' from source: role '' defaults 28011 1726882551.02026: variable 
'__network_packages_default_nm' from source: role '' defaults 28011 1726882551.02286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882551.04188: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882551.04263: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882551.04323: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882551.04365: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882551.04402: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882551.04488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882551.04529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.04564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.04598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.04612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 
1726882551.04645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882551.04660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.04678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.04707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.04722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882551.04876: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28011 1726882551.04956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882551.04973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.04992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.05018: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.05029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882551.05097: variable 'ansible_python' from source: facts 28011 1726882551.05118: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28011 1726882551.05177: variable '__network_wpa_supplicant_required' from source: role '' defaults 28011 1726882551.05234: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28011 1726882551.05322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882551.05340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.05356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.05384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.05457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882551.05461: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882551.05475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.05478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.05482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.05499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882551.05600: variable 'network_connections' from source: task vars 28011 1726882551.05604: variable 'interface' from source: set_fact 28011 1726882551.05671: variable 'interface' from source: set_fact 28011 1726882551.05678: variable 'interface' from source: set_fact 28011 1726882551.05754: variable 'interface' from source: set_fact 28011 1726882551.05813: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882551.05833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882551.05853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.05874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882551.05911: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882551.06097: variable 'network_connections' from source: task vars 28011 1726882551.06100: variable 'interface' from source: set_fact 28011 1726882551.06171: variable 'interface' from source: set_fact 28011 1726882551.06178: variable 'interface' from source: set_fact 28011 1726882551.06251: variable 'interface' from source: set_fact 28011 1726882551.06299: variable '__network_packages_default_wireless' from source: role '' defaults 28011 1726882551.06351: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882551.06681: variable 'network_connections' from source: task vars 28011 1726882551.06685: variable 'interface' from source: set_fact 28011 1726882551.06925: variable 'interface' from source: set_fact 28011 1726882551.06928: variable 'interface' from source: set_fact 28011 1726882551.06933: variable 'interface' from source: set_fact 28011 1726882551.06936: variable '__network_packages_default_team' from source: role '' defaults 28011 1726882551.06973: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882551.07282: variable 'network_connections' from source: task vars 28011 1726882551.07299: variable 'interface' from source: set_fact 28011 1726882551.07364: variable 'interface' from source: set_fact 28011 1726882551.07375: variable 'interface' from source: set_fact 28011 1726882551.07450: variable 'interface' from source: set_fact 28011 1726882551.07522: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 
1726882551.07584: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 1726882551.07602: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882551.07663: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882551.07909: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28011 1726882551.08381: variable 'network_connections' from source: task vars 28011 1726882551.08396: variable 'interface' from source: set_fact 28011 1726882551.08459: variable 'interface' from source: set_fact 28011 1726882551.08470: variable 'interface' from source: set_fact 28011 1726882551.08534: variable 'interface' from source: set_fact 28011 1726882551.08553: variable 'ansible_distribution' from source: facts 28011 1726882551.08561: variable '__network_rh_distros' from source: role '' defaults 28011 1726882551.08569: variable 'ansible_distribution_major_version' from source: facts 28011 1726882551.08599: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28011 1726882551.08768: variable 'ansible_distribution' from source: facts 28011 1726882551.08777: variable '__network_rh_distros' from source: role '' defaults 28011 1726882551.08786: variable 'ansible_distribution_major_version' from source: facts 28011 1726882551.08898: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28011 1726882551.08972: variable 'ansible_distribution' from source: facts 28011 1726882551.08981: variable '__network_rh_distros' from source: role '' defaults 28011 1726882551.08992: variable 'ansible_distribution_major_version' from source: facts 28011 1726882551.09032: variable 'network_provider' from source: set_fact 28011 1726882551.09052: variable 'ansible_facts' from source: unknown 28011 1726882551.09765: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 28011 1726882551.09794: when evaluation is False, skipping this task 28011 1726882551.09805: _execute() done 28011 1726882551.09813: dumping result to json 28011 1726882551.09820: done dumping result, returning 28011 1726882551.09831: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-962d-7c65-00000000006f] 28011 1726882551.09839: sending task result for task 12673a56-9f93-962d-7c65-00000000006f 28011 1726882551.10008: done sending task result for task 12673a56-9f93-962d-7c65-00000000006f 28011 1726882551.10012: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28011 1726882551.10060: no more pending results, returning what we have 28011 1726882551.10063: results queue empty 28011 1726882551.10064: checking for any_errors_fatal 28011 1726882551.10071: done checking for any_errors_fatal 28011 1726882551.10072: checking for max_fail_percentage 28011 1726882551.10074: done checking for max_fail_percentage 28011 1726882551.10074: checking to see if all hosts have failed and the running result is not ok 28011 1726882551.10075: done checking to see if all hosts have failed 28011 1726882551.10076: getting the remaining hosts for this loop 28011 1726882551.10078: done getting the remaining hosts for this loop 28011 1726882551.10081: getting the next task for host managed_node1 28011 1726882551.10087: done getting next task for host managed_node1 28011 1726882551.10091: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28011 1726882551.10094: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882551.10120: getting variables 28011 1726882551.10121: in VariableManager get_vars() 28011 1726882551.10160: Calling all_inventory to load vars for managed_node1 28011 1726882551.10162: Calling groups_inventory to load vars for managed_node1 28011 1726882551.10164: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882551.10174: Calling all_plugins_play to load vars for managed_node1 28011 1726882551.10176: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882551.10179: Calling groups_plugins_play to load vars for managed_node1 28011 1726882551.11734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882551.13310: done with get_vars() 28011 1726882551.13338: done getting variables 28011 1726882551.13398: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:35:51 -0400 (0:00:00.137) 0:00:20.685 ****** 28011 1726882551.13429: entering _queue_task() for managed_node1/package 28011 1726882551.13761: worker is 1 (out of 1 available) 28011 1726882551.13773: exiting 
_queue_task() for managed_node1/package 28011 1726882551.13786: done queuing things up, now waiting for results queue to drain 28011 1726882551.13787: waiting for pending results... 28011 1726882551.14023: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28011 1726882551.14210: in run() - task 12673a56-9f93-962d-7c65-000000000070 28011 1726882551.14216: variable 'ansible_search_path' from source: unknown 28011 1726882551.14219: variable 'ansible_search_path' from source: unknown 28011 1726882551.14226: calling self._execute() 28011 1726882551.14315: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882551.14330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882551.14346: variable 'omit' from source: magic vars 28011 1726882551.14727: variable 'ansible_distribution_major_version' from source: facts 28011 1726882551.14762: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882551.14879: variable 'network_state' from source: role '' defaults 28011 1726882551.14887: Evaluated conditional (network_state != {}): False 28011 1726882551.14895: when evaluation is False, skipping this task 28011 1726882551.14903: _execute() done 28011 1726882551.14910: dumping result to json 28011 1726882551.14917: done dumping result, returning 28011 1726882551.14927: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-962d-7c65-000000000070] 28011 1726882551.14937: sending task result for task 12673a56-9f93-962d-7c65-000000000070 28011 1726882551.15046: done sending task result for task 12673a56-9f93-962d-7c65-000000000070 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 
1726882551.15096: no more pending results, returning what we have 28011 1726882551.15100: results queue empty 28011 1726882551.15101: checking for any_errors_fatal 28011 1726882551.15106: done checking for any_errors_fatal 28011 1726882551.15106: checking for max_fail_percentage 28011 1726882551.15108: done checking for max_fail_percentage 28011 1726882551.15109: checking to see if all hosts have failed and the running result is not ok 28011 1726882551.15109: done checking to see if all hosts have failed 28011 1726882551.15110: getting the remaining hosts for this loop 28011 1726882551.15111: done getting the remaining hosts for this loop 28011 1726882551.15114: getting the next task for host managed_node1 28011 1726882551.15121: done getting next task for host managed_node1 28011 1726882551.15124: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28011 1726882551.15127: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882551.15146: getting variables 28011 1726882551.15148: in VariableManager get_vars() 28011 1726882551.15184: Calling all_inventory to load vars for managed_node1 28011 1726882551.15187: Calling groups_inventory to load vars for managed_node1 28011 1726882551.15191: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882551.15205: Calling all_plugins_play to load vars for managed_node1 28011 1726882551.15208: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882551.15335: WORKER PROCESS EXITING 28011 1726882551.15341: Calling groups_plugins_play to load vars for managed_node1 28011 1726882551.16791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882551.18415: done with get_vars() 28011 1726882551.18450: done getting variables 28011 1726882551.18507: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:35:51 -0400 (0:00:00.051) 0:00:20.736 ****** 28011 1726882551.18537: entering _queue_task() for managed_node1/package 28011 1726882551.18901: worker is 1 (out of 1 available) 28011 1726882551.18915: exiting _queue_task() for managed_node1/package 28011 1726882551.18929: done queuing things up, now waiting for results queue to drain 28011 1726882551.18931: waiting for pending results... 
28011 1726882551.19313: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28011 1726882551.19369: in run() - task 12673a56-9f93-962d-7c65-000000000071 28011 1726882551.19390: variable 'ansible_search_path' from source: unknown 28011 1726882551.19403: variable 'ansible_search_path' from source: unknown 28011 1726882551.19451: calling self._execute() 28011 1726882551.19555: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882551.19567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882551.19583: variable 'omit' from source: magic vars 28011 1726882551.20062: variable 'ansible_distribution_major_version' from source: facts 28011 1726882551.20066: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882551.20147: variable 'network_state' from source: role '' defaults 28011 1726882551.20167: Evaluated conditional (network_state != {}): False 28011 1726882551.20221: when evaluation is False, skipping this task 28011 1726882551.20229: _execute() done 28011 1726882551.20236: dumping result to json 28011 1726882551.20243: done dumping result, returning 28011 1726882551.20489: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-962d-7c65-000000000071] 28011 1726882551.20495: sending task result for task 12673a56-9f93-962d-7c65-000000000071 28011 1726882551.20566: done sending task result for task 12673a56-9f93-962d-7c65-000000000071 28011 1726882551.20569: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882551.20617: no more pending results, returning what we have 28011 1726882551.20620: results queue empty 28011 1726882551.20621: checking for 
any_errors_fatal 28011 1726882551.20631: done checking for any_errors_fatal 28011 1726882551.20632: checking for max_fail_percentage 28011 1726882551.20634: done checking for max_fail_percentage 28011 1726882551.20635: checking to see if all hosts have failed and the running result is not ok 28011 1726882551.20636: done checking to see if all hosts have failed 28011 1726882551.20636: getting the remaining hosts for this loop 28011 1726882551.20638: done getting the remaining hosts for this loop 28011 1726882551.20641: getting the next task for host managed_node1 28011 1726882551.20648: done getting next task for host managed_node1 28011 1726882551.20651: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28011 1726882551.20655: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882551.20676: getting variables 28011 1726882551.20678: in VariableManager get_vars() 28011 1726882551.20788: Calling all_inventory to load vars for managed_node1 28011 1726882551.20791: Calling groups_inventory to load vars for managed_node1 28011 1726882551.20899: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882551.20911: Calling all_plugins_play to load vars for managed_node1 28011 1726882551.20914: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882551.20917: Calling groups_plugins_play to load vars for managed_node1 28011 1726882551.22290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882551.23842: done with get_vars() 28011 1726882551.23863: done getting variables 28011 1726882551.23925: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:35:51 -0400 (0:00:00.054) 0:00:20.790 ****** 28011 1726882551.23959: entering _queue_task() for managed_node1/service 28011 1726882551.24261: worker is 1 (out of 1 available) 28011 1726882551.24274: exiting _queue_task() for managed_node1/service 28011 1726882551.24289: done queuing things up, now waiting for results queue to drain 28011 1726882551.24291: waiting for pending results... 
28011 1726882551.24714: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28011 1726882551.24719: in run() - task 12673a56-9f93-962d-7c65-000000000072 28011 1726882551.24729: variable 'ansible_search_path' from source: unknown 28011 1726882551.24737: variable 'ansible_search_path' from source: unknown 28011 1726882551.24777: calling self._execute() 28011 1726882551.24873: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882551.24886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882551.24907: variable 'omit' from source: magic vars 28011 1726882551.25354: variable 'ansible_distribution_major_version' from source: facts 28011 1726882551.25357: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882551.25429: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882551.25639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882551.28650: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882551.28900: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882551.28904: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882551.29031: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882551.29064: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882551.29256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28011 1726882551.29294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.29326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.29497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.29501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882551.29503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882551.29711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.29739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.29781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.29801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882551.30100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882551.30104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.30107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.30124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.30142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882551.30508: variable 'network_connections' from source: task vars 28011 1726882551.30531: variable 'interface' from source: set_fact 28011 1726882551.30615: variable 'interface' from source: set_fact 28011 1726882551.30798: variable 'interface' from source: set_fact 28011 1726882551.30834: variable 'interface' from source: set_fact 28011 1726882551.30951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882551.31359: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882551.31402: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882551.31598: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882551.31601: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882551.31841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882551.31844: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882551.31846: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.31848: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882551.31960: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882551.32438: variable 'network_connections' from source: task vars 28011 1726882551.32501: variable 'interface' from source: set_fact 28011 1726882551.32567: variable 'interface' from source: set_fact 28011 1726882551.32798: variable 'interface' from source: set_fact 28011 1726882551.32801: variable 'interface' from source: set_fact 28011 1726882551.32910: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28011 1726882551.32917: when evaluation is False, skipping this task 28011 1726882551.33033: _execute() done 28011 1726882551.33036: dumping result to json 28011 1726882551.33038: done dumping result, returning 28011 1726882551.33040: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12673a56-9f93-962d-7c65-000000000072] 28011 1726882551.33050: sending task result for task 12673a56-9f93-962d-7c65-000000000072 28011 1726882551.33119: done sending task result for task 12673a56-9f93-962d-7c65-000000000072 28011 1726882551.33122: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28011 1726882551.33180: no more pending results, returning what we have 28011 1726882551.33183: results queue empty 28011 1726882551.33184: checking for any_errors_fatal 28011 1726882551.33191: done checking for any_errors_fatal 28011 1726882551.33192: checking for max_fail_percentage 28011 1726882551.33196: done checking for max_fail_percentage 28011 1726882551.33197: checking to see if all hosts have failed and the running result is not ok 28011 1726882551.33198: done checking to see if all hosts have failed 28011 1726882551.33199: getting the remaining hosts for this loop 28011 1726882551.33200: done getting the remaining hosts for this loop 28011 1726882551.33205: getting the next task for host managed_node1 28011 1726882551.33212: done getting next task for host managed_node1 28011 1726882551.33216: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28011 1726882551.33220: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28011 1726882551.33239: getting variables 28011 1726882551.33241: in VariableManager get_vars() 28011 1726882551.33283: Calling all_inventory to load vars for managed_node1 28011 1726882551.33286: Calling groups_inventory to load vars for managed_node1 28011 1726882551.33289: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882551.33390: Calling all_plugins_play to load vars for managed_node1 28011 1726882551.33397: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882551.33401: Calling groups_plugins_play to load vars for managed_node1 28011 1726882551.35270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882551.37112: done with get_vars() 28011 1726882551.37135: done getting variables 28011 1726882551.37199: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:35:51 -0400 (0:00:00.132) 0:00:20.923 ****** 28011 1726882551.37232: entering _queue_task() for managed_node1/service 28011 1726882551.37975: worker is 1 (out of 1 available) 28011 1726882551.37987: exiting _queue_task() for managed_node1/service 28011 1726882551.38002: done queuing things up, now waiting for results queue to drain 28011 1726882551.38004: waiting for pending results... 
28011 1726882551.38795: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28011 1726882551.38827: in run() - task 12673a56-9f93-962d-7c65-000000000073 28011 1726882551.38850: variable 'ansible_search_path' from source: unknown 28011 1726882551.38859: variable 'ansible_search_path' from source: unknown 28011 1726882551.38907: calling self._execute() 28011 1726882551.39195: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882551.39312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882551.39316: variable 'omit' from source: magic vars 28011 1726882551.40009: variable 'ansible_distribution_major_version' from source: facts 28011 1726882551.40027: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882551.40341: variable 'network_provider' from source: set_fact 28011 1726882551.40509: variable 'network_state' from source: role '' defaults 28011 1726882551.40512: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28011 1726882551.40515: variable 'omit' from source: magic vars 28011 1726882551.40517: variable 'omit' from source: magic vars 28011 1726882551.40626: variable 'network_service_name' from source: role '' defaults 28011 1726882551.40797: variable 'network_service_name' from source: role '' defaults 28011 1726882551.41300: variable '__network_provider_setup' from source: role '' defaults 28011 1726882551.41304: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882551.41306: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882551.41309: variable '__network_packages_default_nm' from source: role '' defaults 28011 1726882551.41311: variable '__network_packages_default_nm' from source: role '' defaults 28011 1726882551.41733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 28011 1726882551.44207: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882551.44598: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882551.44602: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882551.44604: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882551.44797: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882551.44801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882551.44804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.44806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.44808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.44944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882551.45102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28011 1726882551.45132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.45167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.45299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.45320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882551.45805: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28011 1726882551.45970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882551.46009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.46039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.46081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.46108: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882551.46188: variable 'ansible_python' from source: facts 28011 1726882551.46222: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28011 1726882551.46305: variable '__network_wpa_supplicant_required' from source: role '' defaults 28011 1726882551.46383: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28011 1726882551.46508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882551.46640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.46643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.46645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.46647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882551.46664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882551.46702: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882551.46730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.46778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882551.46801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882551.46946: variable 'network_connections' from source: task vars 28011 1726882551.46961: variable 'interface' from source: set_fact 28011 1726882551.47034: variable 'interface' from source: set_fact 28011 1726882551.47299: variable 'interface' from source: set_fact 28011 1726882551.47302: variable 'interface' from source: set_fact 28011 1726882551.47305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882551.47472: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882551.47529: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882551.47571: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882551.47613: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882551.47680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882551.47718: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882551.47795: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882551.47906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882551.48058: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882551.48567: variable 'network_connections' from source: task vars 28011 1726882551.48573: variable 'interface' from source: set_fact 28011 1726882551.48646: variable 'interface' from source: set_fact 28011 1726882551.48657: variable 'interface' from source: set_fact 28011 1726882551.48939: variable 'interface' from source: set_fact 28011 1726882551.49251: variable '__network_packages_default_wireless' from source: role '' defaults 28011 1726882551.49254: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882551.49554: variable 'network_connections' from source: task vars 28011 1726882551.49674: variable 'interface' from source: set_fact 28011 1726882551.49829: variable 'interface' from source: set_fact 28011 1726882551.49836: variable 'interface' from source: set_fact 28011 1726882551.50019: variable 'interface' from source: set_fact 28011 1726882551.50050: variable '__network_packages_default_team' from source: role '' defaults 28011 1726882551.50233: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882551.50878: variable 
'network_connections' from source: task vars 28011 1726882551.50881: variable 'interface' from source: set_fact 28011 1726882551.50941: variable 'interface' from source: set_fact 28011 1726882551.50947: variable 'interface' from source: set_fact 28011 1726882551.51031: variable 'interface' from source: set_fact 28011 1726882551.51100: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 1726882551.51158: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 1726882551.51208: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882551.51232: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882551.51451: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28011 1726882551.51988: variable 'network_connections' from source: task vars 28011 1726882551.51996: variable 'interface' from source: set_fact 28011 1726882551.52054: variable 'interface' from source: set_fact 28011 1726882551.52064: variable 'interface' from source: set_fact 28011 1726882551.52124: variable 'interface' from source: set_fact 28011 1726882551.52138: variable 'ansible_distribution' from source: facts 28011 1726882551.52142: variable '__network_rh_distros' from source: role '' defaults 28011 1726882551.52148: variable 'ansible_distribution_major_version' from source: facts 28011 1726882551.52176: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28011 1726882551.52355: variable 'ansible_distribution' from source: facts 28011 1726882551.52358: variable '__network_rh_distros' from source: role '' defaults 28011 1726882551.52364: variable 'ansible_distribution_major_version' from source: facts 28011 1726882551.52383: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28011 1726882551.52695: variable 'ansible_distribution' from source: 
facts 28011 1726882551.52699: variable '__network_rh_distros' from source: role '' defaults 28011 1726882551.52733: variable 'ansible_distribution_major_version' from source: facts 28011 1726882551.52767: variable 'network_provider' from source: set_fact 28011 1726882551.52795: variable 'omit' from source: magic vars 28011 1726882551.52842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882551.52857: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882551.52874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882551.52891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882551.53060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882551.53063: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882551.53065: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882551.53067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882551.53069: Set connection var ansible_connection to ssh 28011 1726882551.53071: Set connection var ansible_pipelining to False 28011 1726882551.53073: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882551.53075: Set connection var ansible_shell_executable to /bin/sh 28011 1726882551.53077: Set connection var ansible_timeout to 10 28011 1726882551.53078: Set connection var ansible_shell_type to sh 28011 1726882551.53183: variable 'ansible_shell_executable' from source: unknown 28011 1726882551.53186: variable 'ansible_connection' from source: unknown 28011 1726882551.53189: variable 'ansible_module_compression' from source: unknown 28011 1726882551.53196: 
variable 'ansible_shell_type' from source: unknown 28011 1726882551.53198: variable 'ansible_shell_executable' from source: unknown 28011 1726882551.53200: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882551.53207: variable 'ansible_pipelining' from source: unknown 28011 1726882551.53209: variable 'ansible_timeout' from source: unknown 28011 1726882551.53211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882551.53220: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882551.53234: variable 'omit' from source: magic vars 28011 1726882551.53241: starting attempt loop 28011 1726882551.53244: running the handler 28011 1726882551.53506: variable 'ansible_facts' from source: unknown 28011 1726882551.54242: _low_level_execute_command(): starting 28011 1726882551.54256: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882551.55042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882551.55060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882551.55080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882551.55119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882551.55338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882551.56845: stdout chunk (state=3): >>>/root <<< 28011 1726882551.56951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882551.57001: stderr chunk (state=3): >>><<< 28011 1726882551.57018: stdout chunk (state=3): >>><<< 28011 1726882551.57042: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882551.57059: _low_level_execute_command(): starting 28011 1726882551.57068: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551 `" && echo ansible-tmp-1726882551.5704832-28996-161845672121551="` echo /root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551 `" ) && sleep 0' 28011 1726882551.57653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882551.57666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882551.57680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882551.57697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882551.57714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882551.57759: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882551.57823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882551.57838: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 28011 1726882551.57866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882551.57938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882551.59796: stdout chunk (state=3): >>>ansible-tmp-1726882551.5704832-28996-161845672121551=/root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551 <<< 28011 1726882551.59936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882551.59950: stderr chunk (state=3): >>><<< 28011 1726882551.59958: stdout chunk (state=3): >>><<< 28011 1726882551.59977: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882551.5704832-28996-161845672121551=/root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 
1726882551.60200: variable 'ansible_module_compression' from source: unknown 28011 1726882551.60203: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28011 1726882551.60205: variable 'ansible_facts' from source: unknown 28011 1726882551.60356: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551/AnsiballZ_systemd.py 28011 1726882551.60544: Sending initial data 28011 1726882551.60553: Sent initial data (156 bytes) 28011 1726882551.61127: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882551.61141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882551.61155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882551.61173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882551.61276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882551.61303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882551.61374: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882551.62885: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882551.62930: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882551.62991: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpok7oflit /root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551/AnsiballZ_systemd.py <<< 28011 1726882551.63004: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551/AnsiballZ_systemd.py" <<< 28011 1726882551.63037: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpok7oflit" to remote "/root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551/AnsiballZ_systemd.py" <<< 28011 1726882551.64936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882551.64940: stdout chunk (state=3): >>><<< 28011 1726882551.64942: stderr chunk (state=3): 
>>><<< 28011 1726882551.64944: done transferring module to remote 28011 1726882551.64946: _low_level_execute_command(): starting 28011 1726882551.64949: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551/ /root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551/AnsiballZ_systemd.py && sleep 0' 28011 1726882551.65531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882551.65545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882551.65600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882551.65670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882551.65685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882551.65718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882551.65788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882551.67536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
28011 1726882551.67539: stdout chunk (state=3): >>><<< 28011 1726882551.67546: stderr chunk (state=3): >>><<< 28011 1726882551.67576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882551.67579: _low_level_execute_command(): starting 28011 1726882551.67582: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551/AnsiballZ_systemd.py && sleep 0' 28011 1726882551.68175: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882551.68184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882551.68197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882551.68213: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882551.68226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882551.68233: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882551.68298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882551.68301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882551.68346: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882551.68372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882551.68446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882551.96996: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": 
"no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10833920", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306467328", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", 
"CPUUsageNSec": "1615053000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 28011 1726882551.97003: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", 
"ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": 
"active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28011 1726882551.98900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882551.98904: stdout chunk (state=3): >>><<< 28011 1726882551.98907: stderr chunk (state=3): >>><<< 28011 1726882551.98910: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10833920", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306467328", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1615053000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882551.99045: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882551.99071: _low_level_execute_command(): starting 28011 1726882551.99081: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882551.5704832-28996-161845672121551/ > /dev/null 2>&1 && sleep 0' 28011 1726882551.99744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882551.99759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882551.99775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882551.99814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882551.99835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882551.99923: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882551.99939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882551.99957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882551.99979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882552.00050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882552.01923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882552.01927: stdout chunk (state=3): >>><<< 28011 1726882552.01930: stderr chunk (state=3): >>><<< 28011 1726882552.01946: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882552.01958: handler run complete 28011 1726882552.02106: attempt loop complete, returning result 28011 1726882552.02109: _execute() done 28011 1726882552.02112: dumping result to json 28011 1726882552.02114: done dumping result, returning 28011 1726882552.02116: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-962d-7c65-000000000073] 28011 1726882552.02118: sending task result for task 12673a56-9f93-962d-7c65-000000000073 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882552.02561: no more pending results, returning what we have 28011 1726882552.02565: results queue empty 28011 1726882552.02566: checking for any_errors_fatal 28011 1726882552.02573: done checking for any_errors_fatal 28011 1726882552.02574: checking for max_fail_percentage 28011 1726882552.02576: done checking for max_fail_percentage 28011 1726882552.02577: checking to see if all hosts have failed and the running result is not ok 28011 1726882552.02578: done checking to see if all hosts have failed 28011 1726882552.02579: getting the remaining hosts for this loop 28011 1726882552.02580: done getting the remaining hosts for this loop 28011 1726882552.02584: getting the next task for host managed_node1 28011 1726882552.02592: done getting next task for host managed_node1 28011 1726882552.02722: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28011 1726882552.02726: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882552.02739: getting variables 28011 1726882552.02741: in VariableManager get_vars() 28011 1726882552.02778: Calling all_inventory to load vars for managed_node1 28011 1726882552.02780: Calling groups_inventory to load vars for managed_node1 28011 1726882552.02783: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882552.02828: Calling all_plugins_play to load vars for managed_node1 28011 1726882552.02833: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882552.02839: done sending task result for task 12673a56-9f93-962d-7c65-000000000073 28011 1726882552.02842: WORKER PROCESS EXITING 28011 1726882552.02846: Calling groups_plugins_play to load vars for managed_node1 28011 1726882552.04542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882552.06417: done with get_vars() 28011 1726882552.06445: done getting variables 28011 1726882552.06510: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:35:52 -0400 (0:00:00.693) 0:00:21.616 ****** 28011 1726882552.06543: entering 
_queue_task() for managed_node1/service 28011 1726882552.06907: worker is 1 (out of 1 available) 28011 1726882552.06918: exiting _queue_task() for managed_node1/service 28011 1726882552.06931: done queuing things up, now waiting for results queue to drain 28011 1726882552.06932: waiting for pending results... 28011 1726882552.07225: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28011 1726882552.07343: in run() - task 12673a56-9f93-962d-7c65-000000000074 28011 1726882552.07364: variable 'ansible_search_path' from source: unknown 28011 1726882552.07372: variable 'ansible_search_path' from source: unknown 28011 1726882552.07423: calling self._execute() 28011 1726882552.07527: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882552.07539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882552.07554: variable 'omit' from source: magic vars 28011 1726882552.07954: variable 'ansible_distribution_major_version' from source: facts 28011 1726882552.07973: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882552.08098: variable 'network_provider' from source: set_fact 28011 1726882552.08110: Evaluated conditional (network_provider == "nm"): True 28011 1726882552.08207: variable '__network_wpa_supplicant_required' from source: role '' defaults 28011 1726882552.08291: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28011 1726882552.08453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882552.10608: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882552.10775: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882552.10779: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882552.10781: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882552.10798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882552.10900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882552.10934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882552.10962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882552.11016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882552.11036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882552.11084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882552.11120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882552.11149: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882552.11192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882552.11218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882552.11261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882552.11292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882552.11329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882552.11423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882552.11426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882552.11547: variable 'network_connections' from source: task vars 28011 1726882552.11564: variable 'interface' from source: set_fact 28011 1726882552.11652: variable 'interface' from source: set_fact 28011 1726882552.11666: variable 
'interface' from source: set_fact 28011 1726882552.11733: variable 'interface' from source: set_fact 28011 1726882552.11822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882552.12002: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882552.12043: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882552.12079: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882552.12182: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882552.12185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882552.12187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882552.12214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882552.12243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882552.12302: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882552.12563: variable 'network_connections' from source: task vars 28011 1726882552.12573: variable 'interface' from source: set_fact 28011 1726882552.12645: variable 'interface' from source: set_fact 28011 1726882552.12656: variable 'interface' from source: set_fact 28011 
1726882552.12725: variable 'interface' from source: set_fact 28011 1726882552.12773: Evaluated conditional (__network_wpa_supplicant_required): False 28011 1726882552.12781: when evaluation is False, skipping this task 28011 1726882552.12787: _execute() done 28011 1726882552.12831: dumping result to json 28011 1726882552.12834: done dumping result, returning 28011 1726882552.12836: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-962d-7c65-000000000074] 28011 1726882552.12838: sending task result for task 12673a56-9f93-962d-7c65-000000000074 28011 1726882552.13137: done sending task result for task 12673a56-9f93-962d-7c65-000000000074 28011 1726882552.13140: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28011 1726882552.13187: no more pending results, returning what we have 28011 1726882552.13196: results queue empty 28011 1726882552.13198: checking for any_errors_fatal 28011 1726882552.13217: done checking for any_errors_fatal 28011 1726882552.13218: checking for max_fail_percentage 28011 1726882552.13220: done checking for max_fail_percentage 28011 1726882552.13221: checking to see if all hosts have failed and the running result is not ok 28011 1726882552.13222: done checking to see if all hosts have failed 28011 1726882552.13222: getting the remaining hosts for this loop 28011 1726882552.13224: done getting the remaining hosts for this loop 28011 1726882552.13228: getting the next task for host managed_node1 28011 1726882552.13234: done getting next task for host managed_node1 28011 1726882552.13238: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28011 1726882552.13241: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882552.13260: getting variables 28011 1726882552.13262: in VariableManager get_vars() 28011 1726882552.13478: Calling all_inventory to load vars for managed_node1 28011 1726882552.13480: Calling groups_inventory to load vars for managed_node1 28011 1726882552.13483: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882552.13497: Calling all_plugins_play to load vars for managed_node1 28011 1726882552.13500: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882552.13503: Calling groups_plugins_play to load vars for managed_node1 28011 1726882552.14819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882552.16356: done with get_vars() 28011 1726882552.16380: done getting variables 28011 1726882552.16446: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:35:52 -0400 (0:00:00.099) 0:00:21.716 ****** 28011 1726882552.16477: entering _queue_task() for managed_node1/service 28011 1726882552.16818: worker is 1 (out of 1 available) 28011 1726882552.16831: exiting 
_queue_task() for managed_node1/service 28011 1726882552.16844: done queuing things up, now waiting for results queue to drain 28011 1726882552.16845: waiting for pending results... 28011 1726882552.17131: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 28011 1726882552.17265: in run() - task 12673a56-9f93-962d-7c65-000000000075 28011 1726882552.17288: variable 'ansible_search_path' from source: unknown 28011 1726882552.17303: variable 'ansible_search_path' from source: unknown 28011 1726882552.17347: calling self._execute() 28011 1726882552.17450: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882552.17462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882552.17476: variable 'omit' from source: magic vars 28011 1726882552.17860: variable 'ansible_distribution_major_version' from source: facts 28011 1726882552.17877: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882552.18005: variable 'network_provider' from source: set_fact 28011 1726882552.18199: Evaluated conditional (network_provider == "initscripts"): False 28011 1726882552.18202: when evaluation is False, skipping this task 28011 1726882552.18205: _execute() done 28011 1726882552.18207: dumping result to json 28011 1726882552.18209: done dumping result, returning 28011 1726882552.18212: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-962d-7c65-000000000075] 28011 1726882552.18214: sending task result for task 12673a56-9f93-962d-7c65-000000000075 28011 1726882552.18277: done sending task result for task 12673a56-9f93-962d-7c65-000000000075 28011 1726882552.18280: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882552.18328: no more 
pending results, returning what we have 28011 1726882552.18332: results queue empty 28011 1726882552.18333: checking for any_errors_fatal 28011 1726882552.18346: done checking for any_errors_fatal 28011 1726882552.18347: checking for max_fail_percentage 28011 1726882552.18349: done checking for max_fail_percentage 28011 1726882552.18350: checking to see if all hosts have failed and the running result is not ok 28011 1726882552.18351: done checking to see if all hosts have failed 28011 1726882552.18351: getting the remaining hosts for this loop 28011 1726882552.18353: done getting the remaining hosts for this loop 28011 1726882552.18357: getting the next task for host managed_node1 28011 1726882552.18364: done getting next task for host managed_node1 28011 1726882552.18368: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28011 1726882552.18372: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882552.18401: getting variables 28011 1726882552.18403: in VariableManager get_vars() 28011 1726882552.18444: Calling all_inventory to load vars for managed_node1 28011 1726882552.18447: Calling groups_inventory to load vars for managed_node1 28011 1726882552.18450: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882552.18462: Calling all_plugins_play to load vars for managed_node1 28011 1726882552.18465: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882552.18468: Calling groups_plugins_play to load vars for managed_node1 28011 1726882552.20129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882552.21658: done with get_vars() 28011 1726882552.21681: done getting variables 28011 1726882552.21740: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:35:52 -0400 (0:00:00.052) 0:00:21.769 ****** 28011 1726882552.21771: entering _queue_task() for managed_node1/copy 28011 1726882552.22134: worker is 1 (out of 1 available) 28011 1726882552.22145: exiting _queue_task() for managed_node1/copy 28011 1726882552.22158: done queuing things up, now waiting for results queue to drain 28011 1726882552.22160: waiting for pending results... 
28011 1726882552.22445: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28011 1726882552.22799: in run() - task 12673a56-9f93-962d-7c65-000000000076 28011 1726882552.22804: variable 'ansible_search_path' from source: unknown 28011 1726882552.22806: variable 'ansible_search_path' from source: unknown 28011 1726882552.22809: calling self._execute() 28011 1726882552.22812: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882552.22815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882552.22818: variable 'omit' from source: magic vars 28011 1726882552.23209: variable 'ansible_distribution_major_version' from source: facts 28011 1726882552.23230: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882552.23356: variable 'network_provider' from source: set_fact 28011 1726882552.23371: Evaluated conditional (network_provider == "initscripts"): False 28011 1726882552.23379: when evaluation is False, skipping this task 28011 1726882552.23387: _execute() done 28011 1726882552.23400: dumping result to json 28011 1726882552.23408: done dumping result, returning 28011 1726882552.23422: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-962d-7c65-000000000076] 28011 1726882552.23433: sending task result for task 12673a56-9f93-962d-7c65-000000000076 28011 1726882552.23698: done sending task result for task 12673a56-9f93-962d-7c65-000000000076 28011 1726882552.23701: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28011 1726882552.23749: no more pending results, returning what we have 28011 1726882552.23753: results queue empty 28011 1726882552.23754: checking for 
any_errors_fatal 28011 1726882552.23761: done checking for any_errors_fatal 28011 1726882552.23762: checking for max_fail_percentage 28011 1726882552.23764: done checking for max_fail_percentage 28011 1726882552.23765: checking to see if all hosts have failed and the running result is not ok 28011 1726882552.23766: done checking to see if all hosts have failed 28011 1726882552.23767: getting the remaining hosts for this loop 28011 1726882552.23768: done getting the remaining hosts for this loop 28011 1726882552.23772: getting the next task for host managed_node1 28011 1726882552.23779: done getting next task for host managed_node1 28011 1726882552.23783: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28011 1726882552.23787: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882552.23812: getting variables 28011 1726882552.23815: in VariableManager get_vars() 28011 1726882552.23858: Calling all_inventory to load vars for managed_node1 28011 1726882552.23861: Calling groups_inventory to load vars for managed_node1 28011 1726882552.23863: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882552.23876: Calling all_plugins_play to load vars for managed_node1 28011 1726882552.23879: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882552.23888: Calling groups_plugins_play to load vars for managed_node1 28011 1726882552.25611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882552.27680: done with get_vars() 28011 1726882552.27710: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:35:52 -0400 (0:00:00.060) 0:00:21.829 ****** 28011 1726882552.27800: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 28011 1726882552.28141: worker is 1 (out of 1 available) 28011 1726882552.28154: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 28011 1726882552.28168: done queuing things up, now waiting for results queue to drain 28011 1726882552.28169: waiting for pending results... 
28011 1726882552.28474: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28011 1726882552.28625: in run() - task 12673a56-9f93-962d-7c65-000000000077 28011 1726882552.28645: variable 'ansible_search_path' from source: unknown 28011 1726882552.28653: variable 'ansible_search_path' from source: unknown 28011 1726882552.28696: calling self._execute() 28011 1726882552.28833: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882552.28838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882552.28840: variable 'omit' from source: magic vars 28011 1726882552.29206: variable 'ansible_distribution_major_version' from source: facts 28011 1726882552.29223: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882552.29233: variable 'omit' from source: magic vars 28011 1726882552.29305: variable 'omit' from source: magic vars 28011 1726882552.29485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882552.31804: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882552.31871: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882552.31999: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882552.32002: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882552.32004: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882552.32056: variable 'network_provider' from source: set_fact 28011 1726882552.32197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882552.32235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882552.32264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882552.32314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882552.32335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882552.32414: variable 'omit' from source: magic vars 28011 1726882552.32538: variable 'omit' from source: magic vars 28011 1726882552.32647: variable 'network_connections' from source: task vars 28011 1726882552.32668: variable 'interface' from source: set_fact 28011 1726882552.32738: variable 'interface' from source: set_fact 28011 1726882552.32751: variable 'interface' from source: set_fact 28011 1726882552.32820: variable 'interface' from source: set_fact 28011 1726882552.33021: variable 'omit' from source: magic vars 28011 1726882552.33034: variable '__lsr_ansible_managed' from source: task vars 28011 1726882552.33100: variable '__lsr_ansible_managed' from source: task vars 28011 1726882552.33600: Loaded config def from plugin (lookup/template) 28011 1726882552.33611: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28011 1726882552.33644: File lookup term: get_ansible_managed.j2 28011 
1726882552.33652: variable 'ansible_search_path' from source: unknown 28011 1726882552.33663: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28011 1726882552.33679: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28011 1726882552.33709: variable 'ansible_search_path' from source: unknown 28011 1726882552.46424: variable 'ansible_managed' from source: unknown 28011 1726882552.46720: variable 'omit' from source: magic vars 28011 1726882552.46835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882552.46868: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882552.46930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882552.47022: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882552.47038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882552.47071: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882552.47132: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882552.47141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882552.47451: Set connection var ansible_connection to ssh 28011 1726882552.47454: Set connection var ansible_pipelining to False 28011 1726882552.47456: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882552.47459: Set connection var ansible_shell_executable to /bin/sh 28011 1726882552.47460: Set connection var ansible_timeout to 10 28011 1726882552.47462: Set connection var ansible_shell_type to sh 28011 1726882552.47464: variable 'ansible_shell_executable' from source: unknown 28011 1726882552.47466: variable 'ansible_connection' from source: unknown 28011 1726882552.47468: variable 'ansible_module_compression' from source: unknown 28011 1726882552.47470: variable 'ansible_shell_type' from source: unknown 28011 1726882552.47472: variable 'ansible_shell_executable' from source: unknown 28011 1726882552.47474: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882552.47476: variable 'ansible_pipelining' from source: unknown 28011 1726882552.47478: variable 'ansible_timeout' from source: unknown 28011 1726882552.47480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882552.47802: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882552.47999: variable 'omit' from source: magic vars 28011 1726882552.48002: starting attempt loop 28011 1726882552.48004: running the handler 28011 1726882552.48006: _low_level_execute_command(): starting 28011 1726882552.48009: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882552.49862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882552.49878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882552.49888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882552.49937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882552.50018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882552.50035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882552.50120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882552.51747: stdout chunk 
(state=3): >>>/root <<< 28011 1726882552.51851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882552.51883: stderr chunk (state=3): >>><<< 28011 1726882552.51896: stdout chunk (state=3): >>><<< 28011 1726882552.52022: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882552.52030: _low_level_execute_command(): starting 28011 1726882552.52033: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381 `" && echo ansible-tmp-1726882552.5193202-29033-100638184185381="` echo /root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381 `" ) && sleep 0' 28011 1726882552.53084: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882552.53088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882552.53095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882552.53097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882552.53099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882552.53340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882552.53343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882552.53398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882552.55259: stdout chunk (state=3): >>>ansible-tmp-1726882552.5193202-29033-100638184185381=/root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381 <<< 28011 1726882552.55562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882552.55565: stdout chunk (state=3): >>><<< 28011 1726882552.55567: stderr chunk (state=3): >>><<< 28011 1726882552.55613: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882552.5193202-29033-100638184185381=/root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882552.55699: variable 'ansible_module_compression' from source: unknown 28011 1726882552.55710: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28011 1726882552.55750: variable 'ansible_facts' from source: unknown 28011 1726882552.55888: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381/AnsiballZ_network_connections.py 28011 1726882552.56015: Sending initial data 28011 1726882552.56040: Sent initial data (168 bytes) 28011 1726882552.56635: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882552.56654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882552.56668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882552.56695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882552.56761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882552.56845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882552.56881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882552.56967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882552.58586: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882552.58781: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882552.58916: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpqxy4db7i /root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381/AnsiballZ_network_connections.py <<< 28011 1726882552.59007: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381/AnsiballZ_network_connections.py" <<< 28011 1726882552.59042: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpqxy4db7i" to remote "/root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381/AnsiballZ_network_connections.py" <<< 28011 1726882552.60692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882552.60853: stdout chunk (state=3): >>><<< 28011 1726882552.60856: stderr chunk (state=3): >>><<< 28011 1726882552.60858: done transferring module to remote 28011 1726882552.60860: _low_level_execute_command(): starting 28011 1726882552.60862: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381/ /root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381/AnsiballZ_network_connections.py && sleep 0' 28011 1726882552.61553: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 
1726882552.61562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882552.61573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882552.61587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882552.61604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882552.61613: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882552.61620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882552.61635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882552.61642: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882552.61649: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28011 1726882552.61657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882552.61667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882552.61679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882552.61753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882552.61765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882552.61830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882552.63577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
28011 1726882552.63667: stderr chunk (state=3): >>><<< 28011 1726882552.63671: stdout chunk (state=3): >>><<< 28011 1726882552.63822: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882552.63825: _low_level_execute_command(): starting 28011 1726882552.63827: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381/AnsiballZ_network_connections.py && sleep 0' 28011 1726882552.64710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882552.64746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882552.64830: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882552.64881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882552.64959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882552.92158: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (is-modified)\n[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, 
"table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28011 1726882552.93876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882552.93908: stderr chunk (state=3): >>><<< 28011 1726882552.93911: stdout chunk (state=3): >>><<< 28011 1726882552.93930: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (is-modified)\n[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882552.93978: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26'], 'route': [{'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 'custom'}, {'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 'custom'}, {'network': '192.0.2.64', 'prefix': 26, 'gateway': '198.51.100.8', 'metric': 50, 'table': 'custom', 'src': '198.51.100.3'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882552.93985: _low_level_execute_command(): starting 28011 1726882552.93994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882552.5193202-29033-100638184185381/ > /dev/null 2>&1 && sleep 0' 28011 1726882552.94450: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882552.94454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882552.94456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28011 1726882552.94458: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882552.94460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882552.94516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882552.94520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882552.94522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882552.94566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882552.96345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882552.96370: stderr chunk (state=3): >>><<< 28011 1726882552.96373: stdout chunk (state=3): >>><<< 28011 1726882552.96386: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882552.96394: handler run complete 28011 1726882552.96432: attempt loop complete, returning result 28011 1726882552.96435: _execute() done 28011 1726882552.96437: dumping result to json 28011 1726882552.96444: done dumping result, returning 28011 1726882552.96453: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-962d-7c65-000000000077] 28011 1726882552.96455: sending task result for task 12673a56-9f93-962d-7c65-000000000077 28011 1726882552.96567: done sending task result for task 12673a56-9f93-962d-7c65-000000000077 28011 1726882552.96569: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, 
"src": "198.51.100.3", "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 [004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (is-modified) [005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied 28011 1726882552.96715: no more pending results, returning what we have 28011 1726882552.96719: results queue empty 28011 1726882552.96720: checking for any_errors_fatal 28011 1726882552.96727: done checking for any_errors_fatal 28011 1726882552.96727: checking for max_fail_percentage 28011 1726882552.96729: done checking for max_fail_percentage 28011 1726882552.96730: checking to see if all hosts have failed and the running result is not ok 28011 1726882552.96731: done checking to see if all hosts have failed 28011 1726882552.96731: getting the remaining hosts for this loop 28011 1726882552.96733: done getting the remaining hosts for this loop 28011 1726882552.96736: getting the next task for host managed_node1 28011 1726882552.96740: done getting next task for host managed_node1 28011 1726882552.96743: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28011 1726882552.96746: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 28011 1726882552.96757: getting variables 28011 1726882552.96758: in VariableManager get_vars() 28011 1726882552.96794: Calling all_inventory to load vars for managed_node1 28011 1726882552.96797: Calling groups_inventory to load vars for managed_node1 28011 1726882552.96799: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882552.96814: Calling all_plugins_play to load vars for managed_node1 28011 1726882552.96818: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882552.96821: Calling groups_plugins_play to load vars for managed_node1 28011 1726882552.97741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882552.98603: done with get_vars() 28011 1726882552.98618: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:35:52 -0400 (0:00:00.708) 0:00:22.538 ****** 28011 1726882552.98681: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 28011 1726882552.98954: worker is 1 (out of 1 available) 28011 1726882552.98966: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 28011 1726882552.98979: done queuing things up, now waiting for results queue to drain 28011 1726882552.98980: waiting for pending results... 
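For readability, the `module_args` recorded in the task result above correspond to a role invocation roughly like the following playbook fragment. This is a hedged reconstruction assembled only from the logged values, not the original test playbook (which is not visible in this log); the variable names follow the documented `fedora.linux_system_roles.network` role conventions.

```yaml
# Sketch reconstructed from the logged module_args; the actual test
# playbook that produced this run is not shown in the log.
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: ethtest0
            interface_name: ethtest0
            type: ethernet
            state: up
            autoconnect: true
            ip:
              dhcp4: false
              address:
                - 198.51.100.3/26
              route:
                - network: 198.51.100.128
                  prefix: 26
                  gateway: 198.51.100.1
                  metric: 2
                  table: custom
                - network: 198.51.100.64
                  prefix: 26
                  gateway: 198.51.100.6
                  metric: 4
                  table: custom
                - network: 192.0.2.64
                  prefix: 26
                  gateway: 198.51.100.8
                  metric: 50
                  table: custom
                  src: 198.51.100.3
```

The run reports `changed: true` with the stderr trace `connection reapplied`, i.e. the provider (`nm`) modified and reapplied the existing `ethtest0` profile rather than recreating it.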
28011 1726882552.99309: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 28011 1726882552.99427: in run() - task 12673a56-9f93-962d-7c65-000000000078 28011 1726882552.99510: variable 'ansible_search_path' from source: unknown 28011 1726882552.99513: variable 'ansible_search_path' from source: unknown 28011 1726882552.99515: calling self._execute() 28011 1726882552.99602: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882552.99618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882552.99642: variable 'omit' from source: magic vars 28011 1726882553.00108: variable 'ansible_distribution_major_version' from source: facts 28011 1726882553.00117: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882553.00205: variable 'network_state' from source: role '' defaults 28011 1726882553.00220: Evaluated conditional (network_state != {}): False 28011 1726882553.00223: when evaluation is False, skipping this task 28011 1726882553.00225: _execute() done 28011 1726882553.00228: dumping result to json 28011 1726882553.00231: done dumping result, returning 28011 1726882553.00237: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-962d-7c65-000000000078] 28011 1726882553.00242: sending task result for task 12673a56-9f93-962d-7c65-000000000078 28011 1726882553.00326: done sending task result for task 12673a56-9f93-962d-7c65-000000000078 28011 1726882553.00328: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882553.00380: no more pending results, returning what we have 28011 1726882553.00385: results queue empty 28011 1726882553.00385: checking for any_errors_fatal 28011 1726882553.00402: done checking for any_errors_fatal 
28011 1726882553.00403: checking for max_fail_percentage 28011 1726882553.00404: done checking for max_fail_percentage 28011 1726882553.00405: checking to see if all hosts have failed and the running result is not ok 28011 1726882553.00406: done checking to see if all hosts have failed 28011 1726882553.00407: getting the remaining hosts for this loop 28011 1726882553.00408: done getting the remaining hosts for this loop 28011 1726882553.00412: getting the next task for host managed_node1 28011 1726882553.00417: done getting next task for host managed_node1 28011 1726882553.00420: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28011 1726882553.00423: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882553.00449: getting variables 28011 1726882553.00451: in VariableManager get_vars() 28011 1726882553.00484: Calling all_inventory to load vars for managed_node1 28011 1726882553.00486: Calling groups_inventory to load vars for managed_node1 28011 1726882553.00488: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882553.00498: Calling all_plugins_play to load vars for managed_node1 28011 1726882553.00501: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882553.00503: Calling groups_plugins_play to load vars for managed_node1 28011 1726882553.01259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882553.02492: done with get_vars() 28011 1726882553.02520: done getting variables 28011 1726882553.02581: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:35:53 -0400 (0:00:00.039) 0:00:22.577 ****** 28011 1726882553.02626: entering _queue_task() for managed_node1/debug 28011 1726882553.03019: worker is 1 (out of 1 available) 28011 1726882553.03032: exiting _queue_task() for managed_node1/debug 28011 1726882553.03044: done queuing things up, now waiting for results queue to drain 28011 1726882553.03045: waiting for pending results... 
28011 1726882553.03406: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28011 1726882553.03502: in run() - task 12673a56-9f93-962d-7c65-000000000079 28011 1726882553.03506: variable 'ansible_search_path' from source: unknown 28011 1726882553.03509: variable 'ansible_search_path' from source: unknown 28011 1726882553.03524: calling self._execute() 28011 1726882553.03634: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.03646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882553.03661: variable 'omit' from source: magic vars 28011 1726882553.04065: variable 'ansible_distribution_major_version' from source: facts 28011 1726882553.04154: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882553.04158: variable 'omit' from source: magic vars 28011 1726882553.04163: variable 'omit' from source: magic vars 28011 1726882553.04208: variable 'omit' from source: magic vars 28011 1726882553.04260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882553.04307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882553.04332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882553.04353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882553.04377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882553.04420: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882553.04429: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.04436: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 28011 1726882553.04548: Set connection var ansible_connection to ssh 28011 1726882553.04561: Set connection var ansible_pipelining to False 28011 1726882553.04588: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882553.04595: Set connection var ansible_shell_executable to /bin/sh 28011 1726882553.04599: Set connection var ansible_timeout to 10 28011 1726882553.04702: Set connection var ansible_shell_type to sh 28011 1726882553.04705: variable 'ansible_shell_executable' from source: unknown 28011 1726882553.04708: variable 'ansible_connection' from source: unknown 28011 1726882553.04710: variable 'ansible_module_compression' from source: unknown 28011 1726882553.04712: variable 'ansible_shell_type' from source: unknown 28011 1726882553.04714: variable 'ansible_shell_executable' from source: unknown 28011 1726882553.04715: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.04717: variable 'ansible_pipelining' from source: unknown 28011 1726882553.04719: variable 'ansible_timeout' from source: unknown 28011 1726882553.04721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882553.04833: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882553.04849: variable 'omit' from source: magic vars 28011 1726882553.04858: starting attempt loop 28011 1726882553.04864: running the handler 28011 1726882553.05003: variable '__network_connections_result' from source: set_fact 28011 1726882553.05073: handler run complete 28011 1726882553.05100: attempt loop complete, returning result 28011 1726882553.05107: _execute() done 28011 1726882553.05113: dumping result to json 28011 1726882553.05120: 
done dumping result, returning 28011 1726882553.05139: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-962d-7c65-000000000079] 28011 1726882553.05149: sending task result for task 12673a56-9f93-962d-7c65-000000000079 ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (is-modified)", "[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied" ] } 28011 1726882553.05421: no more pending results, returning what we have 28011 1726882553.05425: results queue empty 28011 1726882553.05426: checking for any_errors_fatal 28011 1726882553.05434: done checking for any_errors_fatal 28011 1726882553.05435: checking for max_fail_percentage 28011 1726882553.05437: done checking for max_fail_percentage 28011 1726882553.05438: checking to see if all hosts have failed and the running result is not ok 28011 1726882553.05439: done checking to see if all hosts have failed 28011 1726882553.05439: getting the remaining hosts for this loop 28011 1726882553.05441: done getting the remaining hosts for this loop 28011 1726882553.05444: getting the next task for host managed_node1 28011 1726882553.05451: done getting next task for host managed_node1 28011 1726882553.05457: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28011 1726882553.05461: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882553.05472: getting variables 28011 1726882553.05474: in VariableManager get_vars() 28011 1726882553.05520: Calling all_inventory to load vars for managed_node1 28011 1726882553.05523: Calling groups_inventory to load vars for managed_node1 28011 1726882553.05525: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882553.05535: Calling all_plugins_play to load vars for managed_node1 28011 1726882553.05539: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882553.05541: Calling groups_plugins_play to load vars for managed_node1 28011 1726882553.06117: done sending task result for task 12673a56-9f93-962d-7c65-000000000079 28011 1726882553.06121: WORKER PROCESS EXITING 28011 1726882553.07262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882553.08912: done with get_vars() 28011 1726882553.08939: done getting variables 28011 1726882553.09014: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:35:53 -0400 (0:00:00.064) 0:00:22.641 ****** 28011 1726882553.09055: entering _queue_task() for 
managed_node1/debug 28011 1726882553.09455: worker is 1 (out of 1 available) 28011 1726882553.09471: exiting _queue_task() for managed_node1/debug 28011 1726882553.09485: done queuing things up, now waiting for results queue to drain 28011 1726882553.09487: waiting for pending results... 28011 1726882553.09811: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28011 1726882553.09971: in run() - task 12673a56-9f93-962d-7c65-00000000007a 28011 1726882553.09997: variable 'ansible_search_path' from source: unknown 28011 1726882553.10006: variable 'ansible_search_path' from source: unknown 28011 1726882553.10053: calling self._execute() 28011 1726882553.10164: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.10179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882553.10200: variable 'omit' from source: magic vars 28011 1726882553.10587: variable 'ansible_distribution_major_version' from source: facts 28011 1726882553.10609: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882553.10691: variable 'omit' from source: magic vars 28011 1726882553.10696: variable 'omit' from source: magic vars 28011 1726882553.10728: variable 'omit' from source: magic vars 28011 1726882553.10767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882553.10808: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882553.10898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882553.10901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882553.10904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28011 1726882553.10906: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882553.10914: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.10926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882553.11039: Set connection var ansible_connection to ssh 28011 1726882553.11052: Set connection var ansible_pipelining to False 28011 1726882553.11062: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882553.11069: Set connection var ansible_shell_executable to /bin/sh 28011 1726882553.11079: Set connection var ansible_timeout to 10 28011 1726882553.11086: Set connection var ansible_shell_type to sh 28011 1726882553.11117: variable 'ansible_shell_executable' from source: unknown 28011 1726882553.11126: variable 'ansible_connection' from source: unknown 28011 1726882553.11136: variable 'ansible_module_compression' from source: unknown 28011 1726882553.11148: variable 'ansible_shell_type' from source: unknown 28011 1726882553.11155: variable 'ansible_shell_executable' from source: unknown 28011 1726882553.11199: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.11202: variable 'ansible_pipelining' from source: unknown 28011 1726882553.11204: variable 'ansible_timeout' from source: unknown 28011 1726882553.11206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882553.11324: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882553.11339: variable 'omit' from source: magic vars 28011 1726882553.11347: starting attempt loop 28011 1726882553.11352: running the handler 28011 
1726882553.11413: variable '__network_connections_result' from source: set_fact 28011 1726882553.11699: variable '__network_connections_result' from source: set_fact 28011 1726882553.11702: handler run complete 28011 1726882553.11728: attempt loop complete, returning result 28011 1726882553.11734: _execute() done 28011 1726882553.11741: dumping result to json 28011 1726882553.11750: done dumping result, returning 28011 1726882553.11763: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-962d-7c65-00000000007a] 28011 1726882553.11771: sending task result for task 12673a56-9f93-962d-7c65-00000000007a 28011 1726882553.11891: done sending task result for task 12673a56-9f93-962d-7c65-00000000007a 28011 1726882553.11900: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (is-modified)\n[005] #0, state:up persistent_state:present, 'ethtest0': 
connection reapplied\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (is-modified)", "[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied" ] } } 28011 1726882553.12038: no more pending results, returning what we have 28011 1726882553.12042: results queue empty 28011 1726882553.12042: checking for any_errors_fatal 28011 1726882553.12049: done checking for any_errors_fatal 28011 1726882553.12050: checking for max_fail_percentage 28011 1726882553.12051: done checking for max_fail_percentage 28011 1726882553.12058: checking to see if all hosts have failed and the running result is not ok 28011 1726882553.12059: done checking to see if all hosts have failed 28011 1726882553.12059: getting the remaining hosts for this loop 28011 1726882553.12061: done getting the remaining hosts for this loop 28011 1726882553.12064: getting the next task for host managed_node1 28011 1726882553.12070: done getting next task for host managed_node1 28011 1726882553.12074: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28011 1726882553.12077: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882553.12092: getting variables 28011 1726882553.12095: in VariableManager get_vars() 28011 1726882553.12134: Calling all_inventory to load vars for managed_node1 28011 1726882553.12137: Calling groups_inventory to load vars for managed_node1 28011 1726882553.12140: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882553.12150: Calling all_plugins_play to load vars for managed_node1 28011 1726882553.12152: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882553.12155: Calling groups_plugins_play to load vars for managed_node1 28011 1726882553.13915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882553.15463: done with get_vars() 28011 1726882553.15489: done getting variables 28011 1726882553.15552: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:35:53 -0400 (0:00:00.065) 0:00:22.707 ****** 28011 1726882553.15586: entering _queue_task() for managed_node1/debug 28011 1726882553.16037: worker is 1 (out of 1 available) 28011 1726882553.16050: exiting _queue_task() for managed_node1/debug 28011 1726882553.16061: done queuing things up, now waiting for results queue to drain 28011 1726882553.16063: waiting for pending results... 
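The stderr lines echoed in `__network_connections_result.stderr_lines` above follow a fixed shape: `[seq] #index, state/persistent_state fields, 'profile': message`. As a small illustration, the pattern below pulls the profile name and message out of each line. The format is an assumption inferred only from the three lines shown in this log, not a documented interface of the role.

```python
import re

# Lines exactly as reported in __network_connections_result.stderr_lines above.
stderr_lines = [
    "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1",
    "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40ca51ba-dbc0-41be-afe6-db495ae3e7c1 (is-modified)",
    "[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied",
]

# Assumed line shape: "[seq] #index, state fields, 'profile': message".
pat = re.compile(r"^\[(\d+)\] #(\d+), (.*?), '([^']+)': (.*)$")

for line in stderr_lines:
    m = pat.match(line)
    if m:
        seq, idx, state, profile, message = m.groups()
        print(f"{profile}: {message}")
```

This is only a log-reading aid; the authoritative result is the structured JSON the module returns, which the role already surfaces via the "Show debug messages for the network_connections" task above.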
28011 1726882553.16263: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28011 1726882553.16419: in run() - task 12673a56-9f93-962d-7c65-00000000007b 28011 1726882553.16599: variable 'ansible_search_path' from source: unknown 28011 1726882553.16603: variable 'ansible_search_path' from source: unknown 28011 1726882553.16605: calling self._execute() 28011 1726882553.16607: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.16609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882553.16612: variable 'omit' from source: magic vars 28011 1726882553.16977: variable 'ansible_distribution_major_version' from source: facts 28011 1726882553.16998: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882553.17399: variable 'network_state' from source: role '' defaults 28011 1726882553.17403: Evaluated conditional (network_state != {}): False 28011 1726882553.17405: when evaluation is False, skipping this task 28011 1726882553.17408: _execute() done 28011 1726882553.17410: dumping result to json 28011 1726882553.17412: done dumping result, returning 28011 1726882553.17415: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-962d-7c65-00000000007b] 28011 1726882553.17417: sending task result for task 12673a56-9f93-962d-7c65-00000000007b 28011 1726882553.17571: done sending task result for task 12673a56-9f93-962d-7c65-00000000007b 28011 1726882553.17574: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 28011 1726882553.17658: no more pending results, returning what we have 28011 1726882553.17662: results queue empty 28011 1726882553.17663: checking for any_errors_fatal 28011 1726882553.17678: done checking for any_errors_fatal 28011 1726882553.17679: checking for 
max_fail_percentage 28011 1726882553.17681: done checking for max_fail_percentage 28011 1726882553.17683: checking to see if all hosts have failed and the running result is not ok 28011 1726882553.17683: done checking to see if all hosts have failed 28011 1726882553.17684: getting the remaining hosts for this loop 28011 1726882553.17686: done getting the remaining hosts for this loop 28011 1726882553.17694: getting the next task for host managed_node1 28011 1726882553.17702: done getting next task for host managed_node1 28011 1726882553.17707: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28011 1726882553.17711: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882553.17734: getting variables 28011 1726882553.17736: in VariableManager get_vars() 28011 1726882553.17778: Calling all_inventory to load vars for managed_node1 28011 1726882553.17781: Calling groups_inventory to load vars for managed_node1 28011 1726882553.17784: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882553.17904: Calling all_plugins_play to load vars for managed_node1 28011 1726882553.17908: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882553.17912: Calling groups_plugins_play to load vars for managed_node1 28011 1726882553.20801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882553.23691: done with get_vars() 28011 1726882553.23726: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:35:53 -0400 (0:00:00.082) 0:00:22.789 ****** 28011 1726882553.23832: entering _queue_task() for managed_node1/ping 28011 1726882553.24198: worker is 1 (out of 1 available) 28011 1726882553.24210: exiting _queue_task() for managed_node1/ping 28011 1726882553.24222: done queuing things up, now waiting for results queue to drain 28011 1726882553.24224: waiting for pending results... 
28011 1726882553.24507: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 28011 1726882553.24659: in run() - task 12673a56-9f93-962d-7c65-00000000007c 28011 1726882553.24677: variable 'ansible_search_path' from source: unknown 28011 1726882553.24683: variable 'ansible_search_path' from source: unknown 28011 1726882553.24728: calling self._execute() 28011 1726882553.24830: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.24846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882553.24861: variable 'omit' from source: magic vars 28011 1726882553.25279: variable 'ansible_distribution_major_version' from source: facts 28011 1726882553.25284: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882553.25287: variable 'omit' from source: magic vars 28011 1726882553.25340: variable 'omit' from source: magic vars 28011 1726882553.25377: variable 'omit' from source: magic vars 28011 1726882553.25431: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882553.25499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882553.25502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882553.25520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882553.25535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882553.25567: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882553.25576: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.25583: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 28011 1726882553.25698: Set connection var ansible_connection to ssh 28011 1726882553.25715: Set connection var ansible_pipelining to False 28011 1726882553.25898: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882553.25901: Set connection var ansible_shell_executable to /bin/sh 28011 1726882553.25903: Set connection var ansible_timeout to 10 28011 1726882553.25905: Set connection var ansible_shell_type to sh 28011 1726882553.25907: variable 'ansible_shell_executable' from source: unknown 28011 1726882553.25908: variable 'ansible_connection' from source: unknown 28011 1726882553.25910: variable 'ansible_module_compression' from source: unknown 28011 1726882553.25912: variable 'ansible_shell_type' from source: unknown 28011 1726882553.25914: variable 'ansible_shell_executable' from source: unknown 28011 1726882553.25915: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.25917: variable 'ansible_pipelining' from source: unknown 28011 1726882553.25919: variable 'ansible_timeout' from source: unknown 28011 1726882553.25921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882553.26016: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882553.26037: variable 'omit' from source: magic vars 28011 1726882553.26045: starting attempt loop 28011 1726882553.26051: running the handler 28011 1726882553.26070: _low_level_execute_command(): starting 28011 1726882553.26081: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882553.26915: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882553.26935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882553.26956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882553.27223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882553.28848: stdout chunk (state=3): >>>/root <<< 28011 1726882553.28956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882553.28992: stderr chunk (state=3): >>><<< 28011 1726882553.28999: stdout chunk (state=3): >>><<< 28011 1726882553.29021: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882553.29086: _low_level_execute_command(): starting 28011 1726882553.29092: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163 `" && echo ansible-tmp-1726882553.2902029-29074-43429933012163="` echo /root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163 `" ) && sleep 0' 28011 1726882553.30281: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882553.30452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.30503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882553.30567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882553.30601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882553.30673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882553.32518: stdout chunk (state=3): >>>ansible-tmp-1726882553.2902029-29074-43429933012163=/root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163 <<< 28011 1726882553.32859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882553.32998: stderr chunk (state=3): >>><<< 28011 1726882553.33002: stdout chunk (state=3): >>><<< 28011 1726882553.33004: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882553.2902029-29074-43429933012163=/root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882553.33038: variable 'ansible_module_compression' from source: unknown 28011 1726882553.33077: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28011 1726882553.33217: variable 'ansible_facts' from source: unknown 28011 1726882553.33296: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163/AnsiballZ_ping.py 28011 1726882553.33770: Sending initial data 28011 1726882553.33774: Sent initial data (152 bytes) 28011 1726882553.34904: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882553.34917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.34930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.34982: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882553.35108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882553.35199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882553.36687: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882553.36779: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882553.36786: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpw3z2nuy0 /root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163/AnsiballZ_ping.py <<< 28011 1726882553.36896: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163/AnsiballZ_ping.py" <<< 28011 1726882553.37054: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpw3z2nuy0" to remote "/root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163/AnsiballZ_ping.py" <<< 28011 1726882553.38426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882553.38541: stderr chunk (state=3): >>><<< 28011 1726882553.38544: stdout chunk (state=3): >>><<< 28011 1726882553.38554: done transferring module to remote 28011 1726882553.38571: _low_level_execute_command(): starting 28011 1726882553.38880: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163/ /root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163/AnsiballZ_ping.py && sleep 0' 28011 1726882553.39617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882553.39712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882553.41359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882553.41464: stderr chunk (state=3): >>><<< 28011 1726882553.41473: stdout chunk (state=3): >>><<< 28011 1726882553.41501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882553.41517: _low_level_execute_command(): starting 28011 1726882553.41528: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163/AnsiballZ_ping.py && sleep 0' 28011 1726882553.43114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.43172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882553.43197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882553.43214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882553.43304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882553.58097: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28011 1726882553.59531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 
closed. <<< 28011 1726882553.59540: stdout chunk (state=3): >>><<< 28011 1726882553.59551: stderr chunk (state=3): >>><<< 28011 1726882553.59578: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882553.59612: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882553.59688: _low_level_execute_command(): starting 28011 1726882553.59703: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882553.2902029-29074-43429933012163/ > /dev/null 2>&1 && sleep 0' 28011 1726882553.60551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882553.60560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882553.60570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882553.60584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882553.60601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882553.60614: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882553.60624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.60637: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882553.60646: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 
1726882553.60651: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28011 1726882553.60658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882553.60668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882553.60679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882553.60687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882553.60696: stderr chunk (state=3): >>>debug2: match found <<< 28011 1726882553.60707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.60770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882553.60788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882553.60802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882553.60867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882553.62799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882553.62803: stdout chunk (state=3): >>><<< 28011 1726882553.62805: stderr chunk (state=3): >>><<< 28011 1726882553.62807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882553.62811: handler run complete 28011 1726882553.62818: attempt loop complete, returning result 28011 1726882553.62820: _execute() done 28011 1726882553.62822: dumping result to json 28011 1726882553.62824: done dumping result, returning 28011 1726882553.62826: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-962d-7c65-00000000007c] 28011 1726882553.62828: sending task result for task 12673a56-9f93-962d-7c65-00000000007c 28011 1726882553.62886: done sending task result for task 12673a56-9f93-962d-7c65-00000000007c 28011 1726882553.62888: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 28011 1726882553.62958: no more pending results, returning what we have 28011 1726882553.62961: results queue empty 28011 1726882553.62962: checking for any_errors_fatal 28011 1726882553.62969: done checking for any_errors_fatal 28011 1726882553.62970: checking for max_fail_percentage 28011 1726882553.62972: done checking for max_fail_percentage 28011 1726882553.62973: checking to see if all hosts have failed and the running result is not ok 28011 1726882553.62973: done checking to see if all hosts have failed 28011 1726882553.62974: getting the remaining hosts for this loop 28011 1726882553.62976: done getting 
the remaining hosts for this loop 28011 1726882553.62979: getting the next task for host managed_node1 28011 1726882553.62992: done getting next task for host managed_node1 28011 1726882553.63102: ^ task is: TASK: meta (role_complete) 28011 1726882553.63106: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882553.63119: getting variables 28011 1726882553.63121: in VariableManager get_vars() 28011 1726882553.63160: Calling all_inventory to load vars for managed_node1 28011 1726882553.63162: Calling groups_inventory to load vars for managed_node1 28011 1726882553.63164: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882553.63174: Calling all_plugins_play to load vars for managed_node1 28011 1726882553.63177: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882553.63179: Calling groups_plugins_play to load vars for managed_node1 28011 1726882553.65083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882553.66840: done with get_vars() 28011 1726882553.66871: done getting variables 28011 1726882553.66958: done queuing things up, now waiting for results queue to drain 28011 1726882553.66960: results queue empty 28011 1726882553.66961: checking for any_errors_fatal 28011 1726882553.66964: done checking for any_errors_fatal 28011 1726882553.66965: checking for max_fail_percentage 28011 1726882553.66966: done checking for 
max_fail_percentage 28011 1726882553.66966: checking to see if all hosts have failed and the running result is not ok 28011 1726882553.66967: done checking to see if all hosts have failed 28011 1726882553.66972: getting the remaining hosts for this loop 28011 1726882553.66974: done getting the remaining hosts for this loop 28011 1726882553.66976: getting the next task for host managed_node1 28011 1726882553.66981: done getting next task for host managed_node1 28011 1726882553.66983: ^ task is: TASK: Get the routes from the named route table 'custom' 28011 1726882553.66985: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882553.66987: getting variables 28011 1726882553.66988: in VariableManager get_vars() 28011 1726882553.67007: Calling all_inventory to load vars for managed_node1 28011 1726882553.67010: Calling groups_inventory to load vars for managed_node1 28011 1726882553.67012: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882553.67017: Calling all_plugins_play to load vars for managed_node1 28011 1726882553.67020: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882553.67022: Calling groups_plugins_play to load vars for managed_node1 28011 1726882553.68218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882553.69960: done with get_vars() 28011 1726882553.69980: done getting variables 28011 1726882553.70029: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Get the routes from the named route table 'custom'] ********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:121 Friday 20 September 2024 21:35:53 -0400 (0:00:00.462) 0:00:23.251 ****** 28011 1726882553.70056: entering _queue_task() for managed_node1/command 28011 1726882553.70425: worker is 1 (out of 1 available) 28011 1726882553.70441: exiting _queue_task() for managed_node1/command 28011 1726882553.70454: done queuing things up, now waiting for results queue to drain 28011 1726882553.70455: waiting for pending results... 28011 1726882553.70915: running TaskExecutor() for managed_node1/TASK: Get the routes from the named route table 'custom' 28011 1726882553.70921: in run() - task 12673a56-9f93-962d-7c65-0000000000ac 28011 1726882553.70924: variable 'ansible_search_path' from source: unknown 28011 1726882553.70927: calling self._execute() 28011 1726882553.71099: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.71102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882553.71105: variable 'omit' from source: magic vars 28011 1726882553.71460: variable 'ansible_distribution_major_version' from source: facts 28011 1726882553.71479: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882553.71497: variable 'omit' from source: magic vars 28011 1726882553.71522: variable 'omit' from source: magic vars 28011 1726882553.71571: variable 'omit' from source: magic vars 28011 1726882553.71624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882553.71699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882553.71774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 
1726882553.71777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882553.71780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882553.71782: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882553.71817: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.71845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882553.71975: Set connection var ansible_connection to ssh 28011 1726882553.72008: Set connection var ansible_pipelining to False 28011 1726882553.72021: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882553.72032: Set connection var ansible_shell_executable to /bin/sh 28011 1726882553.72102: Set connection var ansible_timeout to 10 28011 1726882553.72105: Set connection var ansible_shell_type to sh 28011 1726882553.72107: variable 'ansible_shell_executable' from source: unknown 28011 1726882553.72109: variable 'ansible_connection' from source: unknown 28011 1726882553.72111: variable 'ansible_module_compression' from source: unknown 28011 1726882553.72114: variable 'ansible_shell_type' from source: unknown 28011 1726882553.72116: variable 'ansible_shell_executable' from source: unknown 28011 1726882553.72117: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882553.72119: variable 'ansible_pipelining' from source: unknown 28011 1726882553.72123: variable 'ansible_timeout' from source: unknown 28011 1726882553.72132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882553.72329: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882553.72345: variable 'omit' from source: magic vars 28011 1726882553.72356: starting attempt loop 28011 1726882553.72363: running the handler 28011 1726882553.72385: _low_level_execute_command(): starting 28011 1726882553.72428: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882553.73108: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882553.73208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.73240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882553.73243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882553.73251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882553.73345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882553.74888: stdout chunk (state=3): 
>>>/root <<< 28011 1726882553.74990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882553.75022: stderr chunk (state=3): >>><<< 28011 1726882553.75026: stdout chunk (state=3): >>><<< 28011 1726882553.75047: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882553.75060: _low_level_execute_command(): starting 28011 1726882553.75066: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535 `" && echo ansible-tmp-1726882553.7504785-29096-97225543703535="` echo /root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535 `" ) && sleep 0' 28011 1726882553.75472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882553.75513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882553.75516: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882553.75519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882553.75529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882553.75532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.75571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882553.75576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882553.75626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882553.77459: stdout chunk (state=3): >>>ansible-tmp-1726882553.7504785-29096-97225543703535=/root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535 <<< 28011 1726882553.77562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882553.77592: stderr chunk (state=3): >>><<< 28011 1726882553.77598: stdout chunk (state=3): >>><<< 28011 1726882553.77607: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882553.7504785-29096-97225543703535=/root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882553.77635: variable 'ansible_module_compression' from source: unknown 28011 1726882553.77673: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28011 1726882553.77707: variable 'ansible_facts' from source: unknown 28011 1726882553.77760: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535/AnsiballZ_command.py 28011 1726882553.77860: Sending initial data 28011 1726882553.77863: Sent initial data (155 bytes) 28011 1726882553.78259: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 28011 1726882553.78291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882553.78297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882553.78300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.78302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882553.78304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.78351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882553.78359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882553.78403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882553.79915: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882553.79950: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882553.79996: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpg8pq6tu7 /root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535/AnsiballZ_command.py <<< 28011 1726882553.79998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535/AnsiballZ_command.py" <<< 28011 1726882553.80031: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpg8pq6tu7" to remote "/root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535/AnsiballZ_command.py" <<< 28011 1726882553.80549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882553.80588: stderr chunk (state=3): >>><<< 28011 1726882553.80592: stdout chunk (state=3): >>><<< 28011 1726882553.80634: done transferring module to remote 28011 1726882553.80643: _low_level_execute_command(): starting 28011 1726882553.80646: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535/ /root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535/AnsiballZ_command.py && sleep 0' 28011 1726882553.81166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882553.81170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882553.81173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.81183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882553.81223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882553.81271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882553.82972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882553.82997: stderr chunk (state=3): >>><<< 28011 1726882553.83000: stdout chunk (state=3): >>><<< 28011 1726882553.83015: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882553.83018: _low_level_execute_command(): starting 28011 1726882553.83023: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535/AnsiballZ_command.py && sleep 0' 28011 1726882553.83452: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882553.83455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.83457: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882553.83459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882553.83461: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882553.83496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882553.83510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882553.83568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882553.99100: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "custom"], "start": "2024-09-20 21:35:53.985185", "end": "2024-09-20 21:35:53.988712", "delta": "0:00:00.003527", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882554.00499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882554.00504: stdout chunk (state=3): >>><<< 28011 1726882554.00506: stderr chunk (state=3): >>><<< 28011 1726882554.00701: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "custom"], "start": "2024-09-20 21:35:53.985185", "end": "2024-09-20 21:35:53.988712", "delta": "0:00:00.003527", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882554.00706: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route show table custom', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882554.00711: _low_level_execute_command(): starting 28011 1726882554.00713: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882553.7504785-29096-97225543703535/ > /dev/null 2>&1 && sleep 0' 28011 1726882554.01265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882554.01320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882554.01324: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882554.01482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882554.01599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882554.03345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882554.03349: stdout chunk (state=3): >>><<< 28011 1726882554.03358: stderr chunk (state=3): >>><<< 28011 1726882554.03371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 28011 1726882554.03377: handler run complete 28011 1726882554.03406: Evaluated conditional (False): False 28011 1726882554.03417: attempt loop complete, returning result 28011 1726882554.03420: _execute() done 28011 1726882554.03422: dumping result to json 28011 1726882554.03428: done dumping result, returning 28011 1726882554.03643: done running TaskExecutor() for managed_node1/TASK: Get the routes from the named route table 'custom' [12673a56-9f93-962d-7c65-0000000000ac] 28011 1726882554.03646: sending task result for task 12673a56-9f93-962d-7c65-0000000000ac 28011 1726882554.03716: done sending task result for task 12673a56-9f93-962d-7c65-0000000000ac 28011 1726882554.03719: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "route", "show", "table", "custom" ], "delta": "0:00:00.003527", "end": "2024-09-20 21:35:53.988712", "rc": 0, "start": "2024-09-20 21:35:53.985185" } STDOUT: 192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 28011 1726882554.03965: no more pending results, returning what we have 28011 1726882554.03968: results queue empty 28011 1726882554.03969: checking for any_errors_fatal 28011 1726882554.03971: done checking for any_errors_fatal 28011 1726882554.03971: checking for max_fail_percentage 28011 1726882554.03973: done checking for max_fail_percentage 28011 1726882554.03974: checking to see if all hosts have failed and the running result is not ok 28011 1726882554.03975: done checking to see if all hosts have failed 28011 1726882554.03975: getting the remaining hosts for this loop 28011 1726882554.03977: done getting the remaining hosts for this loop 28011 1726882554.03980: getting the next task for host managed_node1 28011 1726882554.03985: done getting next task for host managed_node1 28011 
1726882554.03988: ^ task is: TASK: Assert that the named route table 'custom' contains the specified route 28011 1726882554.03990: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882554.03994: getting variables 28011 1726882554.03996: in VariableManager get_vars() 28011 1726882554.04038: Calling all_inventory to load vars for managed_node1 28011 1726882554.04040: Calling groups_inventory to load vars for managed_node1 28011 1726882554.04043: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882554.04056: Calling all_plugins_play to load vars for managed_node1 28011 1726882554.04061: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882554.04064: Calling groups_plugins_play to load vars for managed_node1 28011 1726882554.07157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882554.10074: done with get_vars() 28011 1726882554.10305: done getting variables 28011 1726882554.10364: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the named route table 'custom' contains the specified route] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:127 Friday 20 September 2024 21:35:54 -0400 (0:00:00.403) 0:00:23.655 ****** 28011 1726882554.10395: entering _queue_task() for managed_node1/assert 28011 1726882554.10875: worker is 1 (out of 1 available) 28011 
1726882554.10888: exiting _queue_task() for managed_node1/assert 28011 1726882554.10903: done queuing things up, now waiting for results queue to drain 28011 1726882554.10904: waiting for pending results... 28011 1726882554.11224: running TaskExecutor() for managed_node1/TASK: Assert that the named route table 'custom' contains the specified route 28011 1726882554.11324: in run() - task 12673a56-9f93-962d-7c65-0000000000ad 28011 1726882554.11500: variable 'ansible_search_path' from source: unknown 28011 1726882554.11504: calling self._execute() 28011 1726882554.11506: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882554.11510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882554.11513: variable 'omit' from source: magic vars 28011 1726882554.11890: variable 'ansible_distribution_major_version' from source: facts 28011 1726882554.11911: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882554.11923: variable 'omit' from source: magic vars 28011 1726882554.11947: variable 'omit' from source: magic vars 28011 1726882554.12000: variable 'omit' from source: magic vars 28011 1726882554.12048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882554.12098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882554.12123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882554.12145: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882554.12163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882554.12206: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882554.12286: variable 
'ansible_host' from source: host vars for 'managed_node1' 28011 1726882554.12289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882554.12331: Set connection var ansible_connection to ssh 28011 1726882554.12345: Set connection var ansible_pipelining to False 28011 1726882554.12357: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882554.12367: Set connection var ansible_shell_executable to /bin/sh 28011 1726882554.12380: Set connection var ansible_timeout to 10 28011 1726882554.12403: Set connection var ansible_shell_type to sh 28011 1726882554.12433: variable 'ansible_shell_executable' from source: unknown 28011 1726882554.12442: variable 'ansible_connection' from source: unknown 28011 1726882554.12450: variable 'ansible_module_compression' from source: unknown 28011 1726882554.12457: variable 'ansible_shell_type' from source: unknown 28011 1726882554.12464: variable 'ansible_shell_executable' from source: unknown 28011 1726882554.12471: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882554.12479: variable 'ansible_pipelining' from source: unknown 28011 1726882554.12485: variable 'ansible_timeout' from source: unknown 28011 1726882554.12501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882554.12719: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882554.12723: variable 'omit' from source: magic vars 28011 1726882554.12725: starting attempt loop 28011 1726882554.12727: running the handler 28011 1726882554.12838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882554.13084: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882554.13132: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882554.13211: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882554.13251: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882554.13341: variable 'route_table_custom' from source: set_fact 28011 1726882554.13382: Evaluated conditional (route_table_custom.stdout is search("198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2")): True 28011 1726882554.13520: variable 'route_table_custom' from source: set_fact 28011 1726882554.13551: Evaluated conditional (route_table_custom.stdout is search("198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4")): True 28011 1726882554.13688: variable 'route_table_custom' from source: set_fact 28011 1726882554.13801: Evaluated conditional (route_table_custom.stdout is search("192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50")): True 28011 1726882554.13804: handler run complete 28011 1726882554.13807: attempt loop complete, returning result 28011 1726882554.13809: _execute() done 28011 1726882554.13811: dumping result to json 28011 1726882554.13813: done dumping result, returning 28011 1726882554.13815: done running TaskExecutor() for managed_node1/TASK: Assert that the named route table 'custom' contains the specified route [12673a56-9f93-962d-7c65-0000000000ad] 28011 1726882554.13816: sending task result for task 12673a56-9f93-962d-7c65-0000000000ad 28011 1726882554.13881: done sending task result for task 12673a56-9f93-962d-7c65-0000000000ad 28011 1726882554.13884: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 28011 1726882554.13938: no more pending results, returning what we have 28011 
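The three evaluated conditionals above correspond to an assert task along these lines. The `that` expressions are taken verbatim from the log; the surrounding task structure is an assumed reconstruction of tests_route_table.yml:127, not a copy of it.

```yaml
# Sketch reconstructed from the evaluated conditionals in the log;
# task name and conditions match the log, the rest is assumed.
- name: Assert that the named route table 'custom' contains the specified route
  assert:
    that:
      - route_table_custom.stdout is search("198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2")
      - route_table_custom.stdout is search("198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4")
      - route_table_custom.stdout is search("192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50")
```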
1726882554.13941: results queue empty 28011 1726882554.13942: checking for any_errors_fatal 28011 1726882554.13950: done checking for any_errors_fatal 28011 1726882554.13951: checking for max_fail_percentage 28011 1726882554.13953: done checking for max_fail_percentage 28011 1726882554.13955: checking to see if all hosts have failed and the running result is not ok 28011 1726882554.13955: done checking to see if all hosts have failed 28011 1726882554.13956: getting the remaining hosts for this loop 28011 1726882554.13957: done getting the remaining hosts for this loop 28011 1726882554.13961: getting the next task for host managed_node1 28011 1726882554.13967: done getting next task for host managed_node1 28011 1726882554.13969: ^ task is: TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 28011 1726882554.13972: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882554.13976: getting variables 28011 1726882554.13978: in VariableManager get_vars() 28011 1726882554.14034: Calling all_inventory to load vars for managed_node1 28011 1726882554.14036: Calling groups_inventory to load vars for managed_node1 28011 1726882554.14039: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882554.14051: Calling all_plugins_play to load vars for managed_node1 28011 1726882554.14054: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882554.14056: Calling groups_plugins_play to load vars for managed_node1 28011 1726882554.16419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882554.19347: done with get_vars() 28011 1726882554.19378: done getting variables TASK [Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:135 Friday 20 September 2024 21:35:54 -0400 (0:00:00.092) 0:00:23.748 ****** 28011 1726882554.19682: entering _queue_task() for managed_node1/file 28011 1726882554.20445: worker is 1 (out of 1 available) 28011 1726882554.20459: exiting _queue_task() for managed_node1/file 28011 1726882554.20470: done queuing things up, now waiting for results queue to drain 28011 1726882554.20471: waiting for pending results... 
28011 1726882554.21013: running TaskExecutor() for managed_node1/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 28011 1726882554.21018: in run() - task 12673a56-9f93-962d-7c65-0000000000ae 28011 1726882554.21026: variable 'ansible_search_path' from source: unknown 28011 1726882554.21063: calling self._execute() 28011 1726882554.21500: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882554.21504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882554.21506: variable 'omit' from source: magic vars 28011 1726882554.22081: variable 'ansible_distribution_major_version' from source: facts 28011 1726882554.22254: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882554.22266: variable 'omit' from source: magic vars 28011 1726882554.22320: variable 'omit' from source: magic vars 28011 1726882554.22469: variable 'omit' from source: magic vars 28011 1726882554.22526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882554.22637: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882554.22669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882554.22743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882554.22783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882554.23198: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882554.23201: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882554.23203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882554.23205: Set connection var 
ansible_connection to ssh 28011 1726882554.23206: Set connection var ansible_pipelining to False 28011 1726882554.23208: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882554.23209: Set connection var ansible_shell_executable to /bin/sh 28011 1726882554.23211: Set connection var ansible_timeout to 10 28011 1726882554.23212: Set connection var ansible_shell_type to sh 28011 1726882554.23213: variable 'ansible_shell_executable' from source: unknown 28011 1726882554.23698: variable 'ansible_connection' from source: unknown 28011 1726882554.23701: variable 'ansible_module_compression' from source: unknown 28011 1726882554.23706: variable 'ansible_shell_type' from source: unknown 28011 1726882554.23708: variable 'ansible_shell_executable' from source: unknown 28011 1726882554.23710: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882554.23712: variable 'ansible_pipelining' from source: unknown 28011 1726882554.23713: variable 'ansible_timeout' from source: unknown 28011 1726882554.23715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882554.23855: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882554.23873: variable 'omit' from source: magic vars 28011 1726882554.23883: starting attempt loop 28011 1726882554.23895: running the handler 28011 1726882554.23919: _low_level_execute_command(): starting 28011 1726882554.23931: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882554.25173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882554.25192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882554.25347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882554.25370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882554.25381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882554.25465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882554.27133: stdout chunk (state=3): >>>/root <<< 28011 1726882554.27258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882554.27269: stdout chunk (state=3): >>><<< 28011 1726882554.27278: stderr chunk (state=3): >>><<< 28011 1726882554.27304: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882554.27322: _low_level_execute_command(): starting 28011 1726882554.27331: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997 `" && echo ansible-tmp-1726882554.273108-29126-172112836173997="` echo /root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997 `" ) && sleep 0' 28011 1726882554.28164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882554.28184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882554.28263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882554.28312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882554.28325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882554.28341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882554.28429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882554.30313: stdout chunk (state=3): >>>ansible-tmp-1726882554.273108-29126-172112836173997=/root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997 <<< 28011 1726882554.30479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882554.30482: stdout chunk (state=3): >>><<< 28011 1726882554.30484: stderr chunk (state=3): >>><<< 28011 1726882554.30506: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882554.273108-29126-172112836173997=/root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882554.30603: variable 'ansible_module_compression' from source: unknown 28011 1726882554.30633: ANSIBALLZ: Using lock for file 28011 1726882554.30640: ANSIBALLZ: Acquiring lock 28011 1726882554.30648: ANSIBALLZ: Lock acquired: 139767565768352 28011 1726882554.30655: ANSIBALLZ: Creating module 28011 1726882554.47745: ANSIBALLZ: Writing module into payload 28011 1726882554.47849: ANSIBALLZ: Writing module 28011 1726882554.47867: ANSIBALLZ: Renaming module 28011 1726882554.47872: ANSIBALLZ: Done creating module 28011 1726882554.47886: variable 'ansible_facts' from source: unknown 28011 1726882554.47936: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997/AnsiballZ_file.py 28011 1726882554.48030: Sending initial data 28011 1726882554.48033: Sent initial data (152 bytes) 28011 1726882554.48452: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882554.48488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882554.48496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882554.48499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882554.48501: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882554.48503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882554.48541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882554.48544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882554.48603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882554.50142: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28011 1726882554.50149: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882554.50181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882554.50225: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpyriwldlj /root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997/AnsiballZ_file.py <<< 28011 1726882554.50228: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997/AnsiballZ_file.py" <<< 28011 1726882554.50264: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpyriwldlj" to remote "/root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997/AnsiballZ_file.py" <<< 28011 1726882554.50267: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997/AnsiballZ_file.py" <<< 28011 1726882554.50805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882554.50840: stderr chunk (state=3): >>><<< 28011 1726882554.50843: stdout chunk (state=3): >>><<< 28011 1726882554.50888: done transferring module to remote 28011 1726882554.50899: _low_level_execute_command(): starting 28011 1726882554.50903: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997/ /root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997/AnsiballZ_file.py && sleep 0' 28011 1726882554.51330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882554.51334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882554.51336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882554.51338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882554.51340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882554.51384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882554.51388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882554.51439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882554.53160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882554.53178: stderr chunk (state=3): >>><<< 28011 1726882554.53181: stdout chunk (state=3): >>><<< 28011 1726882554.53197: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882554.53201: _low_level_execute_command(): starting 28011 1726882554.53203: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997/AnsiballZ_file.py && sleep 0' 28011 1726882554.53632: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882554.53635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882554.53637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882554.53639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882554.53641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882554.53643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 28011 1726882554.53686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882554.53689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882554.53744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882554.69465: stdout chunk (state=3): >>> {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 28011 1726882554.71001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882554.71005: stdout chunk (state=3): >>><<< 28011 1726882554.71008: stderr chunk (state=3): >>><<< 28011 1726882554.71010: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882554.71013: done with _execute_module (file, {'state': 'absent', 'path': '/etc/iproute2/rt_tables.d/table.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882554.71015: _low_level_execute_command(): starting 28011 1726882554.71017: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882554.273108-29126-172112836173997/ > /dev/null 2>&1 && sleep 0' 28011 1726882554.71832: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882554.71836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882554.71857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882554.71861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882554.71900: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882554.71955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882554.71979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882554.72062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882554.73958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882554.73969: stdout chunk (state=3): >>><<< 28011 1726882554.73981: stderr chunk (state=3): >>><<< 28011 1726882554.74010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882554.74022: handler run complete 28011 1726882554.74050: attempt loop complete, returning result 28011 1726882554.74057: _execute() done 28011 1726882554.74064: dumping result to json 28011 1726882554.74101: done dumping result, returning 28011 1726882554.74113: done running TaskExecutor() for managed_node1/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` [12673a56-9f93-962d-7c65-0000000000ae] 28011 1726882554.74121: sending task result for task 12673a56-9f93-962d-7c65-0000000000ae changed: [managed_node1] => { "changed": true, "path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent" } 28011 1726882554.74301: no more pending results, returning what we have 28011 1726882554.74308: results queue empty 28011 1726882554.74309: checking for any_errors_fatal 28011 1726882554.74315: done checking for any_errors_fatal 28011 1726882554.74316: checking for max_fail_percentage 28011 1726882554.74317: done checking for max_fail_percentage 28011 1726882554.74318: checking to see if all hosts have failed and the running result is not ok 28011 1726882554.74319: done checking to see if all hosts have failed 28011 1726882554.74320: getting the remaining hosts for this loop 28011 1726882554.74321: done getting the remaining hosts for this loop 28011 1726882554.74324: getting the next task for host managed_node1 28011 1726882554.74330: done getting next task for host managed_node1 28011 1726882554.74332: ^ task is: TASK: meta (flush_handlers) 28011 1726882554.74334: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28011 1726882554.74339: getting variables 28011 1726882554.74340: in VariableManager get_vars() 28011 1726882554.74377: Calling all_inventory to load vars for managed_node1 28011 1726882554.74379: Calling groups_inventory to load vars for managed_node1 28011 1726882554.74381: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882554.74392: Calling all_plugins_play to load vars for managed_node1 28011 1726882554.74510: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882554.74515: Calling groups_plugins_play to load vars for managed_node1 28011 1726882554.75219: done sending task result for task 12673a56-9f93-962d-7c65-0000000000ae 28011 1726882554.75223: WORKER PROCESS EXITING 28011 1726882554.76389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882554.78015: done with get_vars() 28011 1726882554.78039: done getting variables 28011 1726882554.78112: in VariableManager get_vars() 28011 1726882554.78127: Calling all_inventory to load vars for managed_node1 28011 1726882554.78129: Calling groups_inventory to load vars for managed_node1 28011 1726882554.78131: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882554.78136: Calling all_plugins_play to load vars for managed_node1 28011 1726882554.78138: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882554.78141: Calling groups_plugins_play to load vars for managed_node1 28011 1726882554.83478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882554.85078: done with get_vars() 28011 1726882554.85108: done queuing things up, now waiting for results queue to drain 28011 1726882554.85110: results queue empty 28011 1726882554.85111: checking for any_errors_fatal 28011 1726882554.85114: done checking for any_errors_fatal 28011 1726882554.85115: checking for 
max_fail_percentage 28011 1726882554.85116: done checking for max_fail_percentage 28011 1726882554.85116: checking to see if all hosts have failed and the running result is not ok 28011 1726882554.85121: done checking to see if all hosts have failed 28011 1726882554.85123: getting the remaining hosts for this loop 28011 1726882554.85124: done getting the remaining hosts for this loop 28011 1726882554.85126: getting the next task for host managed_node1 28011 1726882554.85130: done getting next task for host managed_node1 28011 1726882554.85131: ^ task is: TASK: meta (flush_handlers) 28011 1726882554.85133: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882554.85136: getting variables 28011 1726882554.85137: in VariableManager get_vars() 28011 1726882554.85150: Calling all_inventory to load vars for managed_node1 28011 1726882554.85152: Calling groups_inventory to load vars for managed_node1 28011 1726882554.85154: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882554.85159: Calling all_plugins_play to load vars for managed_node1 28011 1726882554.85161: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882554.85164: Calling groups_plugins_play to load vars for managed_node1 28011 1726882554.86322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882554.88074: done with get_vars() 28011 1726882554.88092: done getting variables 28011 1726882554.88146: in VariableManager get_vars() 28011 1726882554.88159: Calling all_inventory to load vars for managed_node1 28011 1726882554.88162: Calling groups_inventory to load vars for managed_node1 28011 1726882554.88164: Calling all_plugins_inventory to load vars 
for managed_node1 28011 1726882554.88169: Calling all_plugins_play to load vars for managed_node1 28011 1726882554.88171: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882554.88174: Calling groups_plugins_play to load vars for managed_node1 28011 1726882554.89314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882554.91988: done with get_vars() 28011 1726882554.92017: done queuing things up, now waiting for results queue to drain 28011 1726882554.92020: results queue empty 28011 1726882554.92020: checking for any_errors_fatal 28011 1726882554.92022: done checking for any_errors_fatal 28011 1726882554.92023: checking for max_fail_percentage 28011 1726882554.92024: done checking for max_fail_percentage 28011 1726882554.92024: checking to see if all hosts have failed and the running result is not ok 28011 1726882554.92025: done checking to see if all hosts have failed 28011 1726882554.92026: getting the remaining hosts for this loop 28011 1726882554.92027: done getting the remaining hosts for this loop 28011 1726882554.92029: getting the next task for host managed_node1 28011 1726882554.92032: done getting next task for host managed_node1 28011 1726882554.92033: ^ task is: None 28011 1726882554.92034: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882554.92036: done queuing things up, now waiting for results queue to drain 28011 1726882554.92036: results queue empty 28011 1726882554.92037: checking for any_errors_fatal 28011 1726882554.92038: done checking for any_errors_fatal 28011 1726882554.92039: checking for max_fail_percentage 28011 1726882554.92040: done checking for max_fail_percentage 28011 1726882554.92040: checking to see if all hosts have failed and the running result is not ok 28011 1726882554.92041: done checking to see if all hosts have failed 28011 1726882554.92042: getting the next task for host managed_node1 28011 1726882554.92045: done getting next task for host managed_node1 28011 1726882554.92046: ^ task is: None 28011 1726882554.92047: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882554.92311: in VariableManager get_vars() 28011 1726882554.92330: done with get_vars() 28011 1726882554.92337: in VariableManager get_vars() 28011 1726882554.92349: done with get_vars() 28011 1726882554.92353: variable 'omit' from source: magic vars 28011 1726882554.92451: variable 'profile' from source: play vars 28011 1726882554.92586: in VariableManager get_vars() 28011 1726882554.92605: done with get_vars() 28011 1726882554.92625: variable 'omit' from source: magic vars 28011 1726882554.92686: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 28011 1726882554.93368: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28011 1726882554.93391: getting the remaining hosts for this loop 28011 1726882554.93394: done getting the remaining hosts for this loop 28011 1726882554.93397: getting the next task for host managed_node1 28011 1726882554.93400: done getting next task for host managed_node1 28011 1726882554.93402: ^ task is: TASK: Gathering Facts 28011 1726882554.93403: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882554.93405: getting variables 28011 1726882554.93406: in VariableManager get_vars() 28011 1726882554.93417: Calling all_inventory to load vars for managed_node1 28011 1726882554.93420: Calling groups_inventory to load vars for managed_node1 28011 1726882554.93422: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882554.93427: Calling all_plugins_play to load vars for managed_node1 28011 1726882554.93429: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882554.93432: Calling groups_plugins_play to load vars for managed_node1 28011 1726882554.94586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882554.96980: done with get_vars() 28011 1726882554.97205: done getting variables 28011 1726882554.97259: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 21:35:54 -0400 (0:00:00.776) 0:00:24.524 ****** 28011 1726882554.97286: entering _queue_task() for managed_node1/gather_facts 28011 1726882554.98226: worker is 1 (out of 1 available) 28011 1726882554.98235: exiting _queue_task() for managed_node1/gather_facts 28011 1726882554.98244: done queuing things up, now waiting for results queue to drain 28011 1726882554.98246: waiting for pending results... 
28011 1726882554.98714: running TaskExecutor() for managed_node1/TASK: Gathering Facts 28011 1726882554.98719: in run() - task 12673a56-9f93-962d-7c65-0000000006a2 28011 1726882554.98723: variable 'ansible_search_path' from source: unknown 28011 1726882554.98726: calling self._execute() 28011 1726882554.98998: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882554.99301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882554.99305: variable 'omit' from source: magic vars 28011 1726882554.99750: variable 'ansible_distribution_major_version' from source: facts 28011 1726882554.99923: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882554.99934: variable 'omit' from source: magic vars 28011 1726882554.99968: variable 'omit' from source: magic vars 28011 1726882555.00054: variable 'omit' from source: magic vars 28011 1726882555.00254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882555.00296: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882555.00323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882555.00350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882555.00366: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882555.00484: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882555.00496: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882555.00505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882555.00714: Set connection var ansible_connection to ssh 28011 1726882555.00726: Set 
connection var ansible_pipelining to False 28011 1726882555.00734: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882555.00743: Set connection var ansible_shell_executable to /bin/sh 28011 1726882555.00756: Set connection var ansible_timeout to 10 28011 1726882555.00765: Set connection var ansible_shell_type to sh 28011 1726882555.00998: variable 'ansible_shell_executable' from source: unknown 28011 1726882555.01001: variable 'ansible_connection' from source: unknown 28011 1726882555.01003: variable 'ansible_module_compression' from source: unknown 28011 1726882555.01006: variable 'ansible_shell_type' from source: unknown 28011 1726882555.01008: variable 'ansible_shell_executable' from source: unknown 28011 1726882555.01010: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882555.01012: variable 'ansible_pipelining' from source: unknown 28011 1726882555.01014: variable 'ansible_timeout' from source: unknown 28011 1726882555.01016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882555.01339: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882555.01357: variable 'omit' from source: magic vars 28011 1726882555.01366: starting attempt loop 28011 1726882555.01373: running the handler 28011 1726882555.01398: variable 'ansible_facts' from source: unknown 28011 1726882555.01600: _low_level_execute_command(): starting 28011 1726882555.01604: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882555.02524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882555.02540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 28011 1726882555.02555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882555.02586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882555.02673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882555.02712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882555.02731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882555.02758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882555.02907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882555.04571: stdout chunk (state=3): >>>/root <<< 28011 1726882555.04672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882555.04710: stderr chunk (state=3): >>><<< 28011 1726882555.04726: stdout chunk (state=3): >>><<< 28011 1726882555.04961: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882555.04965: _low_level_execute_command(): starting 28011 1726882555.04971: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452 `" && echo ansible-tmp-1726882555.0486012-29162-230240734002452="` echo /root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452 `" ) && sleep 0' 28011 1726882555.06091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882555.06126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882555.06137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882555.06158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882555.06172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882555.06322: stderr chunk 
(state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882555.06391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882555.06426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882555.06429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882555.06534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882555.08371: stdout chunk (state=3): >>>ansible-tmp-1726882555.0486012-29162-230240734002452=/root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452 <<< 28011 1726882555.08513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882555.08528: stderr chunk (state=3): >>><<< 28011 1726882555.08558: stdout chunk (state=3): >>><<< 28011 1726882555.08698: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882555.0486012-29162-230240734002452=/root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882555.08702: variable 'ansible_module_compression' from source: unknown 28011 1726882555.08705: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28011 1726882555.08745: variable 'ansible_facts' from source: unknown 28011 1726882555.08967: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452/AnsiballZ_setup.py 28011 1726882555.09170: Sending initial data 28011 1726882555.09174: Sent initial data (154 bytes) 28011 1726882555.09740: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882555.09778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882555.09808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882555.09920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882555.09941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882555.10030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882555.11526: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882555.11557: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882555.11629: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmplat_gg6u /root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452/AnsiballZ_setup.py <<< 28011 1726882555.11633: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452/AnsiballZ_setup.py" <<< 28011 1726882555.11703: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmplat_gg6u" to remote "/root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452/AnsiballZ_setup.py" <<< 28011 1726882555.13828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882555.14001: stderr chunk (state=3): >>><<< 28011 1726882555.14006: stdout chunk (state=3): >>><<< 28011 1726882555.14008: done transferring module to remote 28011 1726882555.14011: _low_level_execute_command(): starting 28011 1726882555.14013: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452/ /root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452/AnsiballZ_setup.py && sleep 0' 28011 1726882555.14682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882555.14719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882555.14731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882555.14784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882555.16546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882555.16549: stdout chunk (state=3): >>><<< 28011 1726882555.16551: stderr chunk (state=3): >>><<< 28011 1726882555.16655: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882555.16658: _low_level_execute_command(): starting 28011 1726882555.16661: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452/AnsiballZ_setup.py && sleep 0' 28011 1726882555.17244: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882555.17260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882555.17275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882555.17499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882555.17622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882555.17694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882555.84106: stdout chunk (state=3): >>> 
{"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", 
"ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.4375, "5m": 0.38818359375, "15m": 0.2158203125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "55", "epoch": "1726882555", "epoch_int": "1726882555", "date": "2024-09-20", "time": "21:35:55", "iso8601_micro": "2024-09-21T01:35:55.448974Z", "iso8601": 
"2024-09-21T01:35:55Z", "iso8601_basic": "20240920T213555448974", "iso8601_basic_short": "20240920T213555", "tz": "EDT", "tz_dst": <<< 28011 1726882555.84135: stdout chunk (state=3): >>>"EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0", "ethtest0", "peerethtest0"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "4e:7e:e5:ed:a3:c7", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::4c7e:e5ff:feed:a3c7", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", 
"tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "22:94:82:4a:26:63", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "fe80::2094:82ff:fe4a:2663", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": 
"on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"<<< 28011 1726882555.84162: stdout chunk (state=3): >>>address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["198.51.100.3", "10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4c7e:e5ff:feed:a3c7", "fe80::2094:82ff:fe4a:2663", "fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "198.51.100.3"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223", "fe80::2094:82ff:fe4a:2663", "fe80::4c7e:e5ff:feed:a3c7"]}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2947, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 584, "free": 2947}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA<<< 28011 1726882555.84178: stdout chunk (state=3): >>>", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, 
"masters": {}}, "ansible_uptime_seconds": 988, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789839360, "block_size": 4096, "block_total": 65519099, "block_available": 63913535, "block_used": 1605564, "inode_total": 131070960, "inode_available": 131029044, "inode_used": 41916, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28011 1726882555.86078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882555.86107: stderr chunk (state=3): >>><<< 28011 1726882555.86110: stdout chunk (state=3): >>><<< 28011 1726882555.86150: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.4375, "5m": 0.38818359375, "15m": 0.2158203125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": 
"55", "epoch": "1726882555", "epoch_int": "1726882555", "date": "2024-09-20", "time": "21:35:55", "iso8601_micro": "2024-09-21T01:35:55.448974Z", "iso8601": "2024-09-21T01:35:55Z", "iso8601_basic": "20240920T213555448974", "iso8601_basic_short": "20240920T213555", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0", "ethtest0", "peerethtest0"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "4e:7e:e5:ed:a3:c7", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::4c7e:e5ff:feed:a3c7", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", 
"tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "22:94:82:4a:26:63", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "fe80::2094:82ff:fe4a:2663", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", 
"scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["198.51.100.3", "10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4c7e:e5ff:feed:a3c7", "fe80::2094:82ff:fe4a:2663", "fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "198.51.100.3"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223", "fe80::2094:82ff:fe4a:2663", "fe80::4c7e:e5ff:feed:a3c7"]}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, 
"ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2947, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 584, "free": 2947}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 988, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789839360, "block_size": 4096, "block_total": 65519099, "block_available": 63913535, "block_used": 1605564, "inode_total": 131070960, "inode_available": 131029044, "inode_used": 41916, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882555.86437: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882555.86458: _low_level_execute_command(): starting 28011 1726882555.86461: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882555.0486012-29162-230240734002452/ > /dev/null 2>&1 && sleep 0' 28011 1726882555.87111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882555.87167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882555.87183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882555.87204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882555.87296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882555.89096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882555.89124: stderr chunk (state=3): >>><<< 28011 1726882555.89128: stdout chunk (state=3): >>><<< 28011 1726882555.89147: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882555.89153: handler run complete 28011 1726882555.89254: variable 'ansible_facts' from source: unknown 28011 1726882555.89334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882555.89549: variable 'ansible_facts' from source: unknown 28011 1726882555.89625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882555.89716: attempt loop complete, returning result 28011 1726882555.89720: _execute() done 28011 1726882555.89722: dumping result to json 28011 1726882555.89750: done dumping result, returning 28011 1726882555.89756: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-962d-7c65-0000000006a2] 28011 1726882555.89761: sending task result for task 12673a56-9f93-962d-7c65-0000000006a2 ok: [managed_node1] 28011 1726882555.90361: no more pending results, returning what we have 28011 1726882555.90364: results queue empty 28011 1726882555.90364: checking for any_errors_fatal 28011 1726882555.90365: done checking for any_errors_fatal 28011 1726882555.90366: checking for max_fail_percentage 28011 1726882555.90367: done checking for max_fail_percentage 28011 1726882555.90368: checking to see if all hosts have failed and the running result is not ok 28011 1726882555.90368: done checking to 
see if all hosts have failed 28011 1726882555.90368: getting the remaining hosts for this loop 28011 1726882555.90369: done getting the remaining hosts for this loop 28011 1726882555.90372: getting the next task for host managed_node1 28011 1726882555.90375: done getting next task for host managed_node1 28011 1726882555.90376: ^ task is: TASK: meta (flush_handlers) 28011 1726882555.90377: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882555.90380: getting variables 28011 1726882555.90381: in VariableManager get_vars() 28011 1726882555.90404: Calling all_inventory to load vars for managed_node1 28011 1726882555.90406: Calling groups_inventory to load vars for managed_node1 28011 1726882555.90407: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882555.90416: Calling all_plugins_play to load vars for managed_node1 28011 1726882555.90418: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882555.90423: Calling groups_plugins_play to load vars for managed_node1 28011 1726882555.90941: done sending task result for task 12673a56-9f93-962d-7c65-0000000006a2 28011 1726882555.90944: WORKER PROCESS EXITING 28011 1726882555.91168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882555.92064: done with get_vars() 28011 1726882555.92079: done getting variables 28011 1726882555.92129: in VariableManager get_vars() 28011 1726882555.92138: Calling all_inventory to load vars for managed_node1 28011 1726882555.92140: Calling groups_inventory to load vars for managed_node1 28011 1726882555.92141: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882555.92144: Calling all_plugins_play to load 
vars for managed_node1 28011 1726882555.92146: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882555.92147: Calling groups_plugins_play to load vars for managed_node1 28011 1726882555.92862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882555.93753: done with get_vars() 28011 1726882555.93772: done queuing things up, now waiting for results queue to drain 28011 1726882555.93774: results queue empty 28011 1726882555.93775: checking for any_errors_fatal 28011 1726882555.93778: done checking for any_errors_fatal 28011 1726882555.93779: checking for max_fail_percentage 28011 1726882555.93780: done checking for max_fail_percentage 28011 1726882555.93781: checking to see if all hosts have failed and the running result is not ok 28011 1726882555.93781: done checking to see if all hosts have failed 28011 1726882555.93786: getting the remaining hosts for this loop 28011 1726882555.93787: done getting the remaining hosts for this loop 28011 1726882555.93789: getting the next task for host managed_node1 28011 1726882555.93792: done getting next task for host managed_node1 28011 1726882555.93796: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28011 1726882555.93797: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882555.93804: getting variables 28011 1726882555.93805: in VariableManager get_vars() 28011 1726882555.93815: Calling all_inventory to load vars for managed_node1 28011 1726882555.93817: Calling groups_inventory to load vars for managed_node1 28011 1726882555.93818: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882555.93822: Calling all_plugins_play to load vars for managed_node1 28011 1726882555.93823: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882555.93825: Calling groups_plugins_play to load vars for managed_node1 28011 1726882555.94471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882555.95375: done with get_vars() 28011 1726882555.95390: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:35:55 -0400 (0:00:00.981) 0:00:25.505 ****** 28011 1726882555.95447: entering _queue_task() for managed_node1/include_tasks 28011 1726882555.95756: worker is 1 (out of 1 available) 28011 1726882555.95770: exiting _queue_task() for managed_node1/include_tasks 28011 1726882555.95780: done queuing things up, now waiting for results queue to drain 28011 1726882555.95782: waiting for pending results... 
28011 1726882555.95958: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28011 1726882555.96043: in run() - task 12673a56-9f93-962d-7c65-0000000000b7 28011 1726882555.96054: variable 'ansible_search_path' from source: unknown 28011 1726882555.96057: variable 'ansible_search_path' from source: unknown 28011 1726882555.96086: calling self._execute() 28011 1726882555.96167: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882555.96171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882555.96180: variable 'omit' from source: magic vars 28011 1726882555.96459: variable 'ansible_distribution_major_version' from source: facts 28011 1726882555.96468: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882555.96474: _execute() done 28011 1726882555.96477: dumping result to json 28011 1726882555.96479: done dumping result, returning 28011 1726882555.96487: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-962d-7c65-0000000000b7] 28011 1726882555.96497: sending task result for task 12673a56-9f93-962d-7c65-0000000000b7 28011 1726882555.96577: done sending task result for task 12673a56-9f93-962d-7c65-0000000000b7 28011 1726882555.96580: WORKER PROCESS EXITING 28011 1726882555.96618: no more pending results, returning what we have 28011 1726882555.96623: in VariableManager get_vars() 28011 1726882555.96660: Calling all_inventory to load vars for managed_node1 28011 1726882555.96662: Calling groups_inventory to load vars for managed_node1 28011 1726882555.96665: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882555.96676: Calling all_plugins_play to load vars for managed_node1 28011 1726882555.96679: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882555.96681: Calling 
groups_plugins_play to load vars for managed_node1 28011 1726882555.97498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882555.98372: done with get_vars() 28011 1726882555.98384: variable 'ansible_search_path' from source: unknown 28011 1726882555.98385: variable 'ansible_search_path' from source: unknown 28011 1726882555.98406: we have included files to process 28011 1726882555.98407: generating all_blocks data 28011 1726882555.98408: done generating all_blocks data 28011 1726882555.98409: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28011 1726882555.98409: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28011 1726882555.98411: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28011 1726882555.98856: done processing included file 28011 1726882555.98858: iterating over new_blocks loaded from include file 28011 1726882555.98860: in VariableManager get_vars() 28011 1726882555.98878: done with get_vars() 28011 1726882555.98880: filtering new block on tags 28011 1726882555.98897: done filtering new block on tags 28011 1726882555.98900: in VariableManager get_vars() 28011 1726882555.98919: done with get_vars() 28011 1726882555.98921: filtering new block on tags 28011 1726882555.98938: done filtering new block on tags 28011 1726882555.98941: in VariableManager get_vars() 28011 1726882555.98959: done with get_vars() 28011 1726882555.98961: filtering new block on tags 28011 1726882555.98977: done filtering new block on tags 28011 1726882555.98979: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 28011 1726882555.98983: extending task lists for 
all hosts with included blocks 28011 1726882555.99335: done extending task lists 28011 1726882555.99336: done processing included files 28011 1726882555.99337: results queue empty 28011 1726882555.99338: checking for any_errors_fatal 28011 1726882555.99339: done checking for any_errors_fatal 28011 1726882555.99340: checking for max_fail_percentage 28011 1726882555.99341: done checking for max_fail_percentage 28011 1726882555.99342: checking to see if all hosts have failed and the running result is not ok 28011 1726882555.99343: done checking to see if all hosts have failed 28011 1726882555.99344: getting the remaining hosts for this loop 28011 1726882555.99345: done getting the remaining hosts for this loop 28011 1726882555.99347: getting the next task for host managed_node1 28011 1726882555.99351: done getting next task for host managed_node1 28011 1726882555.99354: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28011 1726882555.99356: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882555.99365: getting variables 28011 1726882555.99366: in VariableManager get_vars() 28011 1726882555.99380: Calling all_inventory to load vars for managed_node1 28011 1726882555.99382: Calling groups_inventory to load vars for managed_node1 28011 1726882555.99384: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882555.99389: Calling all_plugins_play to load vars for managed_node1 28011 1726882555.99391: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882555.99396: Calling groups_plugins_play to load vars for managed_node1 28011 1726882556.00579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882556.02117: done with get_vars() 28011 1726882556.02141: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:35:56 -0400 (0:00:00.067) 0:00:25.573 ****** 28011 1726882556.02220: entering _queue_task() for managed_node1/setup 28011 1726882556.02599: worker is 1 (out of 1 available) 28011 1726882556.02613: exiting _queue_task() for managed_node1/setup 28011 1726882556.02630: done queuing things up, now waiting for results queue to drain 28011 1726882556.02632: waiting for pending results... 
28011 1726882556.02909: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28011 1726882556.03050: in run() - task 12673a56-9f93-962d-7c65-0000000006e3 28011 1726882556.03072: variable 'ansible_search_path' from source: unknown 28011 1726882556.03082: variable 'ansible_search_path' from source: unknown 28011 1726882556.03130: calling self._execute() 28011 1726882556.03238: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882556.03251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882556.03265: variable 'omit' from source: magic vars 28011 1726882556.03584: variable 'ansible_distribution_major_version' from source: facts 28011 1726882556.03597: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882556.03740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882556.05600: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882556.05642: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882556.05670: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882556.05699: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882556.05722: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882556.05779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882556.05803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882556.05824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882556.05850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882556.05860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882556.05899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882556.05921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882556.05936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882556.05960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882556.05970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882556.06079: variable '__network_required_facts' from source: role 
'' defaults 28011 1726882556.06086: variable 'ansible_facts' from source: unknown 28011 1726882556.06546: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28011 1726882556.06550: when evaluation is False, skipping this task 28011 1726882556.06552: _execute() done 28011 1726882556.06555: dumping result to json 28011 1726882556.06557: done dumping result, returning 28011 1726882556.06562: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-962d-7c65-0000000006e3] 28011 1726882556.06566: sending task result for task 12673a56-9f93-962d-7c65-0000000006e3 28011 1726882556.06649: done sending task result for task 12673a56-9f93-962d-7c65-0000000006e3 28011 1726882556.06652: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882556.06722: no more pending results, returning what we have 28011 1726882556.06725: results queue empty 28011 1726882556.06726: checking for any_errors_fatal 28011 1726882556.06728: done checking for any_errors_fatal 28011 1726882556.06729: checking for max_fail_percentage 28011 1726882556.06730: done checking for max_fail_percentage 28011 1726882556.06731: checking to see if all hosts have failed and the running result is not ok 28011 1726882556.06732: done checking to see if all hosts have failed 28011 1726882556.06733: getting the remaining hosts for this loop 28011 1726882556.06734: done getting the remaining hosts for this loop 28011 1726882556.06738: getting the next task for host managed_node1 28011 1726882556.06746: done getting next task for host managed_node1 28011 1726882556.06749: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28011 1726882556.06752: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882556.06765: getting variables 28011 1726882556.06767: in VariableManager get_vars() 28011 1726882556.06808: Calling all_inventory to load vars for managed_node1 28011 1726882556.06811: Calling groups_inventory to load vars for managed_node1 28011 1726882556.06813: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882556.06823: Calling all_plugins_play to load vars for managed_node1 28011 1726882556.06826: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882556.06828: Calling groups_plugins_play to load vars for managed_node1 28011 1726882556.08464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882556.10072: done with get_vars() 28011 1726882556.10096: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:35:56 -0400 (0:00:00.079) 0:00:25.653 ****** 28011 1726882556.10188: entering _queue_task() for managed_node1/stat 28011 1726882556.10515: worker is 1 (out of 1 available) 28011 1726882556.10526: exiting _queue_task() for managed_node1/stat 28011 1726882556.10535: done queuing things up, now waiting for results queue to drain 28011 1726882556.10537: waiting for pending results... 
28011 1726882556.10895: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 28011 1726882556.10944: in run() - task 12673a56-9f93-962d-7c65-0000000006e5 28011 1726882556.10956: variable 'ansible_search_path' from source: unknown 28011 1726882556.10960: variable 'ansible_search_path' from source: unknown 28011 1726882556.11000: calling self._execute() 28011 1726882556.11103: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882556.11106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882556.11123: variable 'omit' from source: magic vars 28011 1726882556.11533: variable 'ansible_distribution_major_version' from source: facts 28011 1726882556.11537: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882556.11681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882556.11957: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882556.12005: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882556.12044: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882556.12106: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882556.12161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882556.12185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882556.12263: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882556.12267: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882556.12325: variable '__network_is_ostree' from source: set_fact 28011 1726882556.12332: Evaluated conditional (not __network_is_ostree is defined): False 28011 1726882556.12335: when evaluation is False, skipping this task 28011 1726882556.12337: _execute() done 28011 1726882556.12340: dumping result to json 28011 1726882556.12342: done dumping result, returning 28011 1726882556.12351: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-962d-7c65-0000000006e5] 28011 1726882556.12356: sending task result for task 12673a56-9f93-962d-7c65-0000000006e5 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28011 1726882556.12494: no more pending results, returning what we have 28011 1726882556.12498: results queue empty 28011 1726882556.12499: checking for any_errors_fatal 28011 1726882556.12507: done checking for any_errors_fatal 28011 1726882556.12508: checking for max_fail_percentage 28011 1726882556.12509: done checking for max_fail_percentage 28011 1726882556.12511: checking to see if all hosts have failed and the running result is not ok 28011 1726882556.12511: done checking to see if all hosts have failed 28011 1726882556.12512: getting the remaining hosts for this loop 28011 1726882556.12514: done getting the remaining hosts for this loop 28011 1726882556.12517: getting the next task for host managed_node1 28011 1726882556.12524: done getting next task for host managed_node1 28011 
1726882556.12528: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28011 1726882556.12531: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882556.12546: getting variables 28011 1726882556.12548: in VariableManager get_vars() 28011 1726882556.12585: Calling all_inventory to load vars for managed_node1 28011 1726882556.12588: Calling groups_inventory to load vars for managed_node1 28011 1726882556.12591: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882556.12604: Calling all_plugins_play to load vars for managed_node1 28011 1726882556.12608: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882556.12613: Calling groups_plugins_play to load vars for managed_node1 28011 1726882556.13139: done sending task result for task 12673a56-9f93-962d-7c65-0000000006e5 28011 1726882556.13142: WORKER PROCESS EXITING 28011 1726882556.14420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882556.15418: done with get_vars() 28011 1726882556.15432: done getting variables 28011 1726882556.15472: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:35:56 -0400 (0:00:00.053) 0:00:25.706 ****** 28011 1726882556.15496: entering _queue_task() for managed_node1/set_fact 28011 1726882556.15730: worker is 1 (out of 1 available) 28011 1726882556.15744: exiting _queue_task() for managed_node1/set_fact 28011 1726882556.15755: done queuing things up, now waiting for results queue to drain 28011 1726882556.15757: waiting for pending results... 28011 1726882556.15930: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28011 1726882556.16032: in run() - task 12673a56-9f93-962d-7c65-0000000006e6 28011 1726882556.16042: variable 'ansible_search_path' from source: unknown 28011 1726882556.16046: variable 'ansible_search_path' from source: unknown 28011 1726882556.16073: calling self._execute() 28011 1726882556.16148: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882556.16152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882556.16162: variable 'omit' from source: magic vars 28011 1726882556.16521: variable 'ansible_distribution_major_version' from source: facts 28011 1726882556.16525: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882556.16817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882556.16935: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882556.16978: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882556.17016: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 
1726882556.17050: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882556.17133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882556.17157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882556.17183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882556.17214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882556.17301: variable '__network_is_ostree' from source: set_fact 28011 1726882556.17308: Evaluated conditional (not __network_is_ostree is defined): False 28011 1726882556.17311: when evaluation is False, skipping this task 28011 1726882556.17314: _execute() done 28011 1726882556.17316: dumping result to json 28011 1726882556.17319: done dumping result, returning 28011 1726882556.17379: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-962d-7c65-0000000006e6] 28011 1726882556.17382: sending task result for task 12673a56-9f93-962d-7c65-0000000006e6 28011 1726882556.17439: done sending task result for task 12673a56-9f93-962d-7c65-0000000006e6 28011 1726882556.17441: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28011 1726882556.17508: no more pending results, returning what we 
have 28011 1726882556.17512: results queue empty 28011 1726882556.17512: checking for any_errors_fatal 28011 1726882556.17520: done checking for any_errors_fatal 28011 1726882556.17521: checking for max_fail_percentage 28011 1726882556.17523: done checking for max_fail_percentage 28011 1726882556.17524: checking to see if all hosts have failed and the running result is not ok 28011 1726882556.17525: done checking to see if all hosts have failed 28011 1726882556.17525: getting the remaining hosts for this loop 28011 1726882556.17526: done getting the remaining hosts for this loop 28011 1726882556.17530: getting the next task for host managed_node1 28011 1726882556.17539: done getting next task for host managed_node1 28011 1726882556.17542: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28011 1726882556.17545: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882556.17558: getting variables 28011 1726882556.17559: in VariableManager get_vars() 28011 1726882556.17591: Calling all_inventory to load vars for managed_node1 28011 1726882556.17637: Calling groups_inventory to load vars for managed_node1 28011 1726882556.17640: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882556.17649: Calling all_plugins_play to load vars for managed_node1 28011 1726882556.17651: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882556.17654: Calling groups_plugins_play to load vars for managed_node1 28011 1726882556.18514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882556.19522: done with get_vars() 28011 1726882556.19537: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:35:56 -0400 (0:00:00.041) 0:00:25.747 ****** 28011 1726882556.19618: entering _queue_task() for managed_node1/service_facts 28011 1726882556.19917: worker is 1 (out of 1 available) 28011 1726882556.19929: exiting _queue_task() for managed_node1/service_facts 28011 1726882556.19941: done queuing things up, now waiting for results queue to drain 28011 1726882556.19942: waiting for pending results... 
28011 1726882556.20274: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 28011 1726882556.20342: in run() - task 12673a56-9f93-962d-7c65-0000000006e8 28011 1726882556.20354: variable 'ansible_search_path' from source: unknown 28011 1726882556.20358: variable 'ansible_search_path' from source: unknown 28011 1726882556.20399: calling self._execute() 28011 1726882556.20498: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882556.20502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882556.20560: variable 'omit' from source: magic vars 28011 1726882556.20887: variable 'ansible_distribution_major_version' from source: facts 28011 1726882556.20901: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882556.20907: variable 'omit' from source: magic vars 28011 1726882556.20973: variable 'omit' from source: magic vars 28011 1726882556.21010: variable 'omit' from source: magic vars 28011 1726882556.21051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882556.21089: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882556.21114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882556.21130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882556.21140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882556.21171: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882556.21179: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882556.21182: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 28011 1726882556.21308: Set connection var ansible_connection to ssh 28011 1726882556.21318: Set connection var ansible_pipelining to False 28011 1726882556.21321: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882556.21324: Set connection var ansible_shell_executable to /bin/sh 28011 1726882556.21326: Set connection var ansible_timeout to 10 28011 1726882556.21329: Set connection var ansible_shell_type to sh 28011 1726882556.21347: variable 'ansible_shell_executable' from source: unknown 28011 1726882556.21350: variable 'ansible_connection' from source: unknown 28011 1726882556.21353: variable 'ansible_module_compression' from source: unknown 28011 1726882556.21355: variable 'ansible_shell_type' from source: unknown 28011 1726882556.21357: variable 'ansible_shell_executable' from source: unknown 28011 1726882556.21359: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882556.21363: variable 'ansible_pipelining' from source: unknown 28011 1726882556.21365: variable 'ansible_timeout' from source: unknown 28011 1726882556.21370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882556.21526: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882556.21537: variable 'omit' from source: magic vars 28011 1726882556.21540: starting attempt loop 28011 1726882556.21542: running the handler 28011 1726882556.21554: _low_level_execute_command(): starting 28011 1726882556.21561: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882556.22048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28011 1726882556.22051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882556.22055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882556.22057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882556.22110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882556.22117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882556.22161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882556.23791: stdout chunk (state=3): >>>/root <<< 28011 1726882556.24037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882556.24041: stdout chunk (state=3): >>><<< 28011 1726882556.24043: stderr chunk (state=3): >>><<< 28011 1726882556.24047: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882556.24050: _low_level_execute_command(): starting 28011 1726882556.24052: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985 `" && echo ansible-tmp-1726882556.2395308-29217-97722793146985="` echo /root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985 `" ) && sleep 0' 28011 1726882556.24503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882556.24506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882556.24509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 28011 1726882556.24519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882556.24522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882556.24570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882556.24573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882556.24616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882556.26477: stdout chunk (state=3): >>>ansible-tmp-1726882556.2395308-29217-97722793146985=/root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985 <<< 28011 1726882556.26571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882556.26597: stderr chunk (state=3): >>><<< 28011 1726882556.26604: stdout chunk (state=3): >>><<< 28011 1726882556.26627: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882556.2395308-29217-97722793146985=/root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882556.26661: variable 'ansible_module_compression' from source: unknown 28011 1726882556.26705: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28011 1726882556.26735: variable 'ansible_facts' from source: unknown 28011 1726882556.26796: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985/AnsiballZ_service_facts.py 28011 1726882556.26895: Sending initial data 28011 1726882556.26899: Sent initial data (161 bytes) 28011 1726882556.27326: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882556.27329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882556.27332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882556.27334: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882556.27336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882556.27382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882556.27386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882556.27430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882556.28934: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28011 1726882556.28943: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882556.28973: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882556.29023: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpl3zf128p /root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985/AnsiballZ_service_facts.py <<< 28011 1726882556.29026: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985/AnsiballZ_service_facts.py" <<< 28011 1726882556.29061: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpl3zf128p" to remote "/root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985/AnsiballZ_service_facts.py" <<< 28011 1726882556.29067: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985/AnsiballZ_service_facts.py" <<< 28011 1726882556.29579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882556.29621: stderr chunk (state=3): >>><<< 28011 1726882556.29624: stdout chunk (state=3): >>><<< 28011 1726882556.29647: done transferring module to remote 28011 1726882556.29658: _low_level_execute_command(): starting 28011 1726882556.29663: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985/ /root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985/AnsiballZ_service_facts.py && sleep 0' 28011 1726882556.30102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882556.30105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882556.30108: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882556.30110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882556.30116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882556.30161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882556.30164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882556.30214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882556.31904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882556.31928: stderr chunk (state=3): >>><<< 28011 1726882556.31931: stdout chunk (state=3): >>><<< 28011 1726882556.31946: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882556.31949: _low_level_execute_command(): starting 28011 1726882556.31956: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985/AnsiballZ_service_facts.py && sleep 0' 28011 1726882556.32378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882556.32382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882556.32405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882556.32454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' 
<<< 28011 1726882556.32459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882556.32461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882556.32513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882557.82472: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 28011 1726882557.82554: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28011 1726882557.84038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882557.84074: stderr chunk (state=3): >>><<< 28011 1726882557.84090: stdout chunk (state=3): >>><<< 28011 1726882557.84308: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", 
"source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, 
"debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882557.84829: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882557.84846: _low_level_execute_command(): starting 28011 1726882557.84856: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882556.2395308-29217-97722793146985/ > /dev/null 2>&1 && sleep 0' 28011 1726882557.85436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882557.85452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882557.85470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882557.85491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882557.85512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882557.85524: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882557.85609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882557.85625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882557.85643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882557.85656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882557.85728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882557.87507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882557.87560: stderr chunk (state=3): >>><<< 28011 1726882557.87572: stdout chunk (state=3): >>><<< 28011 1726882557.87589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882557.87602: handler run complete 28011 1726882557.87808: variable 'ansible_facts' from source: unknown 28011 1726882557.87957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882557.88427: variable 'ansible_facts' from source: unknown 28011 1726882557.88572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882557.88771: attempt loop complete, returning result 28011 1726882557.88781: _execute() done 28011 1726882557.88788: dumping result to json 28011 1726882557.88856: done dumping result, returning 28011 1726882557.88870: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-962d-7c65-0000000006e8] 28011 1726882557.88880: sending task result for task 12673a56-9f93-962d-7c65-0000000006e8 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882557.90089: no more pending results, returning what we have 28011 1726882557.90092: results queue empty 28011 1726882557.90098: checking for any_errors_fatal 28011 1726882557.90103: done checking for any_errors_fatal 28011 1726882557.90104: checking for max_fail_percentage 28011 1726882557.90105: done checking for max_fail_percentage 28011 1726882557.90106: checking to see if all hosts have failed and the running result is not ok 28011 1726882557.90107: done checking to see if all hosts have failed 28011 1726882557.90108: getting the remaining hosts for this loop 28011 1726882557.90109: done getting the remaining hosts for this loop 28011 1726882557.90113: getting the next task for 
host managed_node1 28011 1726882557.90117: done getting next task for host managed_node1 28011 1726882557.90121: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28011 1726882557.90123: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882557.90133: getting variables 28011 1726882557.90134: in VariableManager get_vars() 28011 1726882557.90164: Calling all_inventory to load vars for managed_node1 28011 1726882557.90166: Calling groups_inventory to load vars for managed_node1 28011 1726882557.90168: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882557.90177: Calling all_plugins_play to load vars for managed_node1 28011 1726882557.90179: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882557.90182: Calling groups_plugins_play to load vars for managed_node1 28011 1726882557.90708: done sending task result for task 12673a56-9f93-962d-7c65-0000000006e8 28011 1726882557.90712: WORKER PROCESS EXITING 28011 1726882557.91604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882557.93338: done with get_vars() 28011 1726882557.93366: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:35:57 -0400 
(0:00:01.738) 0:00:27.485 ******
28011 1726882557.93469: entering _queue_task() for managed_node1/package_facts
28011 1726882557.93850: worker is 1 (out of 1 available)
28011 1726882557.93863: exiting _queue_task() for managed_node1/package_facts
28011 1726882557.93875: done queuing things up, now waiting for results queue to drain
28011 1726882557.93877: waiting for pending results...
28011 1726882557.94170: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed
28011 1726882557.94324: in run() - task 12673a56-9f93-962d-7c65-0000000006e9
28011 1726882557.94346: variable 'ansible_search_path' from source: unknown
28011 1726882557.94430: variable 'ansible_search_path' from source: unknown
28011 1726882557.94435: calling self._execute()
28011 1726882557.94505: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882557.94519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882557.94543: variable 'omit' from source: magic vars
28011 1726882557.94912: variable 'ansible_distribution_major_version' from source: facts
28011 1726882557.94929: Evaluated conditional (ansible_distribution_major_version != '6'): True
28011 1726882557.94937: variable 'omit' from source: magic vars
28011 1726882557.94996: variable 'omit' from source: magic vars
28011 1726882557.95033: variable 'omit' from source: magic vars
28011 1726882557.95079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
28011 1726882557.95124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
28011 1726882557.95148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
28011 1726882557.95166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28011 1726882557.95191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28011 1726882557.95221: variable 'inventory_hostname' from source: host vars for 'managed_node1'
28011 1726882557.95302: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882557.95305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882557.95349: Set connection var ansible_connection to ssh
28011 1726882557.95364: Set connection var ansible_pipelining to False
28011 1726882557.95376: Set connection var ansible_module_compression to ZIP_DEFLATED
28011 1726882557.95387: Set connection var ansible_shell_executable to /bin/sh
28011 1726882557.95406: Set connection var ansible_timeout to 10
28011 1726882557.95420: Set connection var ansible_shell_type to sh
28011 1726882557.95449: variable 'ansible_shell_executable' from source: unknown
28011 1726882557.95457: variable 'ansible_connection' from source: unknown
28011 1726882557.95466: variable 'ansible_module_compression' from source: unknown
28011 1726882557.95473: variable 'ansible_shell_type' from source: unknown
28011 1726882557.95480: variable 'ansible_shell_executable' from source: unknown
28011 1726882557.95487: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882557.95519: variable 'ansible_pipelining' from source: unknown
28011 1726882557.95522: variable 'ansible_timeout' from source: unknown
28011 1726882557.95524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882557.95720: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
28011 1726882557.95797: variable 'omit' from source: magic vars
28011 1726882557.95800: starting attempt loop
28011 1726882557.95803: running the handler
28011 1726882557.95805: _low_level_execute_command(): starting
28011 1726882557.95807: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
28011 1726882557.96613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
28011 1726882557.96642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28011 1726882557.96660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28011 1726882557.96756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28011 1726882557.98313: stdout chunk (state=3): >>>/root <<<
28011 1726882557.98469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28011 1726882557.98472: stdout chunk (state=3): >>><<<
28011 1726882557.98475: stderr chunk (state=3): >>><<<
28011 1726882557.98588: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1:
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28011 1726882557.98595: _low_level_execute_command(): starting
28011 1726882557.98599: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988 `" && echo ansible-tmp-1726882557.9850006-29279-228225345698988="` echo /root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988 `" ) && sleep 0'
28011 1726882557.99217: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28011 1726882557.99244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
28011 1726882557.99263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28011 1726882557.99287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28011 1726882557.99359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28011 1726882558.01206: stdout chunk (state=3): >>>ansible-tmp-1726882557.9850006-29279-228225345698988=/root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988 <<<
28011 1726882558.01339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28011 1726882558.01351: stdout chunk (state=3): >>><<<
28011 1726882558.01369: stderr chunk (state=3): >>><<<
28011 1726882558.01508: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882557.9850006-29279-228225345698988=/root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28011 1726882558.01512: variable 'ansible_module_compression' from source: unknown
28011 1726882558.01515: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED
28011 1726882558.01570: variable 'ansible_facts' from source: unknown
28011 1726882558.01774: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988/AnsiballZ_package_facts.py
28011 1726882558.01966: Sending initial data
28011 1726882558.01976: Sent initial data (162 bytes)
28011 1726882558.02583: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
28011 1726882558.02604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
28011 1726882558.02620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
28011 1726882558.02718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28011 1726882558.02736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
28011 1726882558.02748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28011 1726882558.02764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28011 1726882558.02839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28011 1726882558.04352: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
28011 1726882558.04400: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
28011 1726882558.04455: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmph2k04p82 /root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988/AnsiballZ_package_facts.py <<<
28011 1726882558.04458: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988/AnsiballZ_package_facts.py" <<<
28011 1726882558.04492: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmph2k04p82" to remote "/root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988/AnsiballZ_package_facts.py" <<<
28011 1726882558.06000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28011 1726882558.06062: stderr chunk (state=3): >>><<<
28011 1726882558.06065: stdout chunk (state=3): >>><<<
28011 1726882558.06068: done transferring module to remote
28011 1726882558.06073: _low_level_execute_command(): starting
28011 1726882558.06082: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988/ /root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988/AnsiballZ_package_facts.py && sleep 0'
28011 1726882558.06702: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
28011 1726882558.06828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28011 1726882558.06924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
28011 1726882558.07035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28011 1726882558.07058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28011 1726882558.07127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28011 1726882558.09001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28011 1726882558.09004: stdout chunk (state=3): >>><<<
28011 1726882558.09007: stderr chunk (state=3): >>><<<
28011 1726882558.09009: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28011 1726882558.09013: _low_level_execute_command(): starting
28011 1726882558.09016: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988/AnsiballZ_package_facts.py && sleep 0'
28011 1726882558.10257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
28011 1726882558.10365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
28011 1726882558.10369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<<
28011 1726882558.10371: stderr chunk (state=3): >>>debug2: match not found <<<
28011 1726882558.10373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28011 1726882558.10376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
28011 1726882558.10384: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<<
28011 1726882558.10390: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
28011 1726882558.10405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
28011 1726882558.10416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
28011 1726882558.10474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
28011 1726882558.10477:
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882558.10480: stderr chunk (state=3): >>>debug2: match found <<< 28011 1726882558.10482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882558.10631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882558.10822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882558.54477: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", 
"version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 28011 1726882558.54504: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", 
"release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 28011 1726882558.54520: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": 
"2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": 
"hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": 
"3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 28011 1726882558.54551: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": 
"libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 28011 1726882558.54564: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": 
[{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 28011 1726882558.54603: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": 
"systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": 
"iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": 
[{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"<<< 28011 1726882558.54627: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", 
"version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 28011 1726882558.54650: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": 
"irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", 
"version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": 
[{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 28011 1726882558.54668: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", 
"source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 28011 1726882558.54686: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": 
[{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 28011 1726882558.54701: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": 
"python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": 
"24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 28011 1726882558.54711: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28011 1726882558.56443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.9.159 closed. <<< 28011 1726882558.56474: stderr chunk (state=3): >>><<< 28011 1726882558.56477: stdout chunk (state=3): >>><<< 28011 1726882558.56528: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", 
"version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": 
"1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", 
"version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": 
"libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", 
"release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", 
"release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", 
"version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", 
"source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": 
"1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": 
[{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", 
"release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": 
"10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 
closed. 28011 1726882558.57709: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882558.57726: _low_level_execute_command(): starting 28011 1726882558.57731: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882557.9850006-29279-228225345698988/ > /dev/null 2>&1 && sleep 0' 28011 1726882558.58171: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882558.58198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882558.58202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882558.58256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882558.58259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882558.58262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882558.58314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882558.60121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882558.60151: stderr chunk (state=3): >>><<< 28011 1726882558.60154: stdout chunk (state=3): >>><<< 28011 1726882558.60165: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882558.60171: handler 
run complete 28011 1726882558.60677: variable 'ansible_facts' from source: unknown 28011 1726882558.60943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882558.61987: variable 'ansible_facts' from source: unknown 28011 1726882558.62227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882558.62613: attempt loop complete, returning result 28011 1726882558.62623: _execute() done 28011 1726882558.62626: dumping result to json 28011 1726882558.62741: done dumping result, returning 28011 1726882558.62749: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-962d-7c65-0000000006e9] 28011 1726882558.62752: sending task result for task 12673a56-9f93-962d-7c65-0000000006e9 28011 1726882558.64072: done sending task result for task 12673a56-9f93-962d-7c65-0000000006e9 28011 1726882558.64076: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882558.64169: no more pending results, returning what we have 28011 1726882558.64171: results queue empty 28011 1726882558.64171: checking for any_errors_fatal 28011 1726882558.64179: done checking for any_errors_fatal 28011 1726882558.64179: checking for max_fail_percentage 28011 1726882558.64180: done checking for max_fail_percentage 28011 1726882558.64181: checking to see if all hosts have failed and the running result is not ok 28011 1726882558.64181: done checking to see if all hosts have failed 28011 1726882558.64182: getting the remaining hosts for this loop 28011 1726882558.64183: done getting the remaining hosts for this loop 28011 1726882558.64186: getting the next task for host managed_node1 28011 1726882558.64191: done getting next task for host managed_node1 28011 
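The "censored" result above is Ansible's `no_log: true` handling at work: the module ran successfully (`ok`), but its return data is replaced with the censorship notice before display. The logged `module_args` (`manager: ["auto"]`, `strategy: "first"`) suggest a task shaped roughly like the sketch below; the role's actual task file is not shown in this log, so this reconstruction is an assumption.

```yaml
# Hypothetical reconstruction of the "Check which packages are installed"
# task, inferred from the module_args and no_log censoring visible above.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto     # logged as 'manager': ['auto']
    strategy: first   # logged as 'strategy': 'first'
  no_log: true        # why the result is displayed as "censored"
```

The gathered package list lands in `ansible_facts.packages`, which matches the `variable 'ansible_facts' from source: unknown` entries recorded just after the handler run.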
1726882558.64195: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28011 1726882558.64196: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882558.64203: getting variables 28011 1726882558.64204: in VariableManager get_vars() 28011 1726882558.64226: Calling all_inventory to load vars for managed_node1 28011 1726882558.64228: Calling groups_inventory to load vars for managed_node1 28011 1726882558.64229: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882558.64235: Calling all_plugins_play to load vars for managed_node1 28011 1726882558.64237: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882558.64240: Calling groups_plugins_play to load vars for managed_node1 28011 1726882558.64931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882558.65811: done with get_vars() 28011 1726882558.65828: done getting variables 28011 1726882558.65872: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:35:58 -0400 (0:00:00.724) 0:00:28.210 ****** 28011 1726882558.65896: entering _queue_task() for managed_node1/debug 28011 1726882558.66130: worker is 1 (out of 1 available) 28011 1726882558.66142: exiting _queue_task() for 
managed_node1/debug 28011 1726882558.66153: done queuing things up, now waiting for results queue to drain 28011 1726882558.66155: waiting for pending results... 28011 1726882558.66327: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 28011 1726882558.66403: in run() - task 12673a56-9f93-962d-7c65-0000000000b8 28011 1726882558.66417: variable 'ansible_search_path' from source: unknown 28011 1726882558.66421: variable 'ansible_search_path' from source: unknown 28011 1726882558.66449: calling self._execute() 28011 1726882558.66528: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882558.66533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882558.66542: variable 'omit' from source: magic vars 28011 1726882558.66824: variable 'ansible_distribution_major_version' from source: facts 28011 1726882558.66833: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882558.66838: variable 'omit' from source: magic vars 28011 1726882558.66866: variable 'omit' from source: magic vars 28011 1726882558.66938: variable 'network_provider' from source: set_fact 28011 1726882558.66951: variable 'omit' from source: magic vars 28011 1726882558.66981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882558.67011: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882558.67027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882558.67043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882558.67052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882558.67077: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882558.67080: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882558.67084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882558.67155: Set connection var ansible_connection to ssh 28011 1726882558.67162: Set connection var ansible_pipelining to False 28011 1726882558.67168: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882558.67173: Set connection var ansible_shell_executable to /bin/sh 28011 1726882558.67180: Set connection var ansible_timeout to 10 28011 1726882558.67185: Set connection var ansible_shell_type to sh 28011 1726882558.67207: variable 'ansible_shell_executable' from source: unknown 28011 1726882558.67210: variable 'ansible_connection' from source: unknown 28011 1726882558.67213: variable 'ansible_module_compression' from source: unknown 28011 1726882558.67216: variable 'ansible_shell_type' from source: unknown 28011 1726882558.67218: variable 'ansible_shell_executable' from source: unknown 28011 1726882558.67220: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882558.67222: variable 'ansible_pipelining' from source: unknown 28011 1726882558.67226: variable 'ansible_timeout' from source: unknown 28011 1726882558.67230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882558.67333: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882558.67342: variable 'omit' from source: magic vars 28011 1726882558.67345: starting attempt loop 28011 1726882558.67348: running the handler 28011 1726882558.67385: handler run complete 28011 1726882558.67399: attempt loop complete, 
returning result 28011 1726882558.67403: _execute() done 28011 1726882558.67406: dumping result to json 28011 1726882558.67408: done dumping result, returning 28011 1726882558.67415: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-962d-7c65-0000000000b8] 28011 1726882558.67420: sending task result for task 12673a56-9f93-962d-7c65-0000000000b8 28011 1726882558.67498: done sending task result for task 12673a56-9f93-962d-7c65-0000000000b8 28011 1726882558.67501: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 28011 1726882558.67552: no more pending results, returning what we have 28011 1726882558.67556: results queue empty 28011 1726882558.67556: checking for any_errors_fatal 28011 1726882558.67566: done checking for any_errors_fatal 28011 1726882558.67567: checking for max_fail_percentage 28011 1726882558.67568: done checking for max_fail_percentage 28011 1726882558.67569: checking to see if all hosts have failed and the running result is not ok 28011 1726882558.67570: done checking to see if all hosts have failed 28011 1726882558.67571: getting the remaining hosts for this loop 28011 1726882558.67572: done getting the remaining hosts for this loop 28011 1726882558.67576: getting the next task for host managed_node1 28011 1726882558.67582: done getting next task for host managed_node1 28011 1726882558.67585: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28011 1726882558.67586: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882558.67598: getting variables 28011 1726882558.67599: in VariableManager get_vars() 28011 1726882558.67632: Calling all_inventory to load vars for managed_node1 28011 1726882558.67635: Calling groups_inventory to load vars for managed_node1 28011 1726882558.67637: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882558.67645: Calling all_plugins_play to load vars for managed_node1 28011 1726882558.67648: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882558.67650: Calling groups_plugins_play to load vars for managed_node1 28011 1726882558.68490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882558.69360: done with get_vars() 28011 1726882558.69377: done getting variables 28011 1726882558.69419: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:35:58 -0400 (0:00:00.035) 0:00:28.245 ****** 28011 1726882558.69442: entering _queue_task() for managed_node1/fail 28011 1726882558.69672: worker is 1 (out of 1 available) 28011 1726882558.69684: exiting _queue_task() for managed_node1/fail 28011 1726882558.69697: done queuing things up, now waiting for results queue to drain 28011 1726882558.69699: waiting for pending results... 
28011 1726882558.69867: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28011 1726882558.69945: in run() - task 12673a56-9f93-962d-7c65-0000000000b9 28011 1726882558.69957: variable 'ansible_search_path' from source: unknown 28011 1726882558.69961: variable 'ansible_search_path' from source: unknown 28011 1726882558.69988: calling self._execute() 28011 1726882558.70068: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882558.70071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882558.70081: variable 'omit' from source: magic vars 28011 1726882558.70373: variable 'ansible_distribution_major_version' from source: facts 28011 1726882558.70383: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882558.70469: variable 'network_state' from source: role '' defaults 28011 1726882558.70480: Evaluated conditional (network_state != {}): False 28011 1726882558.70483: when evaluation is False, skipping this task 28011 1726882558.70486: _execute() done 28011 1726882558.70489: dumping result to json 28011 1726882558.70491: done dumping result, returning 28011 1726882558.70497: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-962d-7c65-0000000000b9] 28011 1726882558.70504: sending task result for task 12673a56-9f93-962d-7c65-0000000000b9 28011 1726882558.70584: done sending task result for task 12673a56-9f93-962d-7c65-0000000000b9 28011 1726882558.70587: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882558.70633: no more pending results, 
returning what we have 28011 1726882558.70637: results queue empty 28011 1726882558.70637: checking for any_errors_fatal 28011 1726882558.70647: done checking for any_errors_fatal 28011 1726882558.70648: checking for max_fail_percentage 28011 1726882558.70649: done checking for max_fail_percentage 28011 1726882558.70650: checking to see if all hosts have failed and the running result is not ok 28011 1726882558.70651: done checking to see if all hosts have failed 28011 1726882558.70652: getting the remaining hosts for this loop 28011 1726882558.70653: done getting the remaining hosts for this loop 28011 1726882558.70656: getting the next task for host managed_node1 28011 1726882558.70661: done getting next task for host managed_node1 28011 1726882558.70664: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28011 1726882558.70666: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882558.70680: getting variables 28011 1726882558.70681: in VariableManager get_vars() 28011 1726882558.70718: Calling all_inventory to load vars for managed_node1 28011 1726882558.70721: Calling groups_inventory to load vars for managed_node1 28011 1726882558.70723: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882558.70731: Calling all_plugins_play to load vars for managed_node1 28011 1726882558.70734: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882558.70736: Calling groups_plugins_play to load vars for managed_node1 28011 1726882558.71496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882558.72448: done with get_vars() 28011 1726882558.72463: done getting variables 28011 1726882558.72505: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:35:58 -0400 (0:00:00.030) 0:00:28.276 ****** 28011 1726882558.72526: entering _queue_task() for managed_node1/fail 28011 1726882558.72741: worker is 1 (out of 1 available) 28011 1726882558.72755: exiting _queue_task() for managed_node1/fail 28011 1726882558.72768: done queuing things up, now waiting for results queue to drain 28011 1726882558.72769: waiting for pending results... 
28011 1726882558.72936: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28011 1726882558.73005: in run() - task 12673a56-9f93-962d-7c65-0000000000ba 28011 1726882558.73015: variable 'ansible_search_path' from source: unknown 28011 1726882558.73018: variable 'ansible_search_path' from source: unknown 28011 1726882558.73045: calling self._execute() 28011 1726882558.73123: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882558.73127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882558.73135: variable 'omit' from source: magic vars 28011 1726882558.73415: variable 'ansible_distribution_major_version' from source: facts 28011 1726882558.73428: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882558.73509: variable 'network_state' from source: role '' defaults 28011 1726882558.73517: Evaluated conditional (network_state != {}): False 28011 1726882558.73519: when evaluation is False, skipping this task 28011 1726882558.73522: _execute() done 28011 1726882558.73525: dumping result to json 28011 1726882558.73528: done dumping result, returning 28011 1726882558.73535: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-962d-7c65-0000000000ba] 28011 1726882558.73547: sending task result for task 12673a56-9f93-962d-7c65-0000000000ba 28011 1726882558.73632: done sending task result for task 12673a56-9f93-962d-7c65-0000000000ba 28011 1726882558.73635: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882558.73680: no more pending results, returning what we have 28011 
1726882558.73684: results queue empty 28011 1726882558.73685: checking for any_errors_fatal 28011 1726882558.73696: done checking for any_errors_fatal 28011 1726882558.73697: checking for max_fail_percentage 28011 1726882558.73698: done checking for max_fail_percentage 28011 1726882558.73699: checking to see if all hosts have failed and the running result is not ok 28011 1726882558.73700: done checking to see if all hosts have failed 28011 1726882558.73700: getting the remaining hosts for this loop 28011 1726882558.73702: done getting the remaining hosts for this loop 28011 1726882558.73705: getting the next task for host managed_node1 28011 1726882558.73710: done getting next task for host managed_node1 28011 1726882558.73713: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28011 1726882558.73714: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882558.73727: getting variables 28011 1726882558.73729: in VariableManager get_vars() 28011 1726882558.73758: Calling all_inventory to load vars for managed_node1 28011 1726882558.73760: Calling groups_inventory to load vars for managed_node1 28011 1726882558.73762: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882558.73770: Calling all_plugins_play to load vars for managed_node1 28011 1726882558.73772: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882558.73774: Calling groups_plugins_play to load vars for managed_node1 28011 1726882558.77900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882558.78767: done with get_vars() 28011 1726882558.78782: done getting variables 28011 1726882558.78822: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:35:58 -0400 (0:00:00.063) 0:00:28.339 ****** 28011 1726882558.78840: entering _queue_task() for managed_node1/fail 28011 1726882558.79106: worker is 1 (out of 1 available) 28011 1726882558.79124: exiting _queue_task() for managed_node1/fail 28011 1726882558.79138: done queuing things up, now waiting for results queue to drain 28011 1726882558.79140: waiting for pending results... 
28011 1726882558.79346: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28011 1726882558.79426: in run() - task 12673a56-9f93-962d-7c65-0000000000bb 28011 1726882558.79436: variable 'ansible_search_path' from source: unknown 28011 1726882558.79439: variable 'ansible_search_path' from source: unknown 28011 1726882558.79469: calling self._execute() 28011 1726882558.79546: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882558.79552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882558.79562: variable 'omit' from source: magic vars 28011 1726882558.79835: variable 'ansible_distribution_major_version' from source: facts 28011 1726882558.79844: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882558.79960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882558.81898: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882558.81902: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882558.81905: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882558.81908: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882558.81910: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882558.81966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882558.82003: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882558.82036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882558.82080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882558.82104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882558.82202: variable 'ansible_distribution_major_version' from source: facts 28011 1726882558.82215: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28011 1726882558.82302: variable 'ansible_distribution' from source: facts 28011 1726882558.82309: variable '__network_rh_distros' from source: role '' defaults 28011 1726882558.82323: Evaluated conditional (ansible_distribution in __network_rh_distros): True 28011 1726882558.82480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882558.82502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882558.82520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 
1726882558.82545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882558.82556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882558.82590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882558.82610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882558.82627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882558.82651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882558.82662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882558.82695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882558.82712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 28011 1726882558.82730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882558.82754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882558.82764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882558.82954: variable 'network_connections' from source: play vars 28011 1726882558.82963: variable 'profile' from source: play vars 28011 1726882558.83015: variable 'profile' from source: play vars 28011 1726882558.83018: variable 'interface' from source: set_fact 28011 1726882558.83064: variable 'interface' from source: set_fact 28011 1726882558.83074: variable 'network_state' from source: role '' defaults 28011 1726882558.83122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882558.83371: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882558.83399: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882558.83421: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882558.83443: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882558.83478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882558.83495: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882558.83513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882558.83530: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882558.83549: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 28011 1726882558.83552: when evaluation is False, skipping this task 28011 1726882558.83554: _execute() done 28011 1726882558.83557: dumping result to json 28011 1726882558.83559: done dumping result, returning 28011 1726882558.83566: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-962d-7c65-0000000000bb] 28011 1726882558.83573: sending task result for task 12673a56-9f93-962d-7c65-0000000000bb 28011 1726882558.83665: done sending task result for task 12673a56-9f93-962d-7c65-0000000000bb 28011 1726882558.83668: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 28011 
1726882558.83728: no more pending results, returning what we have 28011 1726882558.83731: results queue empty 28011 1726882558.83732: checking for any_errors_fatal 28011 1726882558.83742: done checking for any_errors_fatal 28011 1726882558.83742: checking for max_fail_percentage 28011 1726882558.83744: done checking for max_fail_percentage 28011 1726882558.83745: checking to see if all hosts have failed and the running result is not ok 28011 1726882558.83746: done checking to see if all hosts have failed 28011 1726882558.83746: getting the remaining hosts for this loop 28011 1726882558.83748: done getting the remaining hosts for this loop 28011 1726882558.83751: getting the next task for host managed_node1 28011 1726882558.83756: done getting next task for host managed_node1 28011 1726882558.83760: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28011 1726882558.83761: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882558.83774: getting variables 28011 1726882558.83776: in VariableManager get_vars() 28011 1726882558.83816: Calling all_inventory to load vars for managed_node1 28011 1726882558.83818: Calling groups_inventory to load vars for managed_node1 28011 1726882558.83821: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882558.83831: Calling all_plugins_play to load vars for managed_node1 28011 1726882558.83833: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882558.83835: Calling groups_plugins_play to load vars for managed_node1 28011 1726882558.85170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882558.86749: done with get_vars() 28011 1726882558.86771: done getting variables 28011 1726882558.86830: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:35:58 -0400 (0:00:00.080) 0:00:28.419 ****** 28011 1726882558.86858: entering _queue_task() for managed_node1/dnf 28011 1726882558.87169: worker is 1 (out of 1 available) 28011 1726882558.87182: exiting _queue_task() for managed_node1/dnf 28011 1726882558.87397: done queuing things up, now waiting for results queue to drain 28011 1726882558.87399: waiting for pending results... 
28011 1726882558.87527: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28011 1726882558.87581: in run() - task 12673a56-9f93-962d-7c65-0000000000bc 28011 1726882558.87603: variable 'ansible_search_path' from source: unknown 28011 1726882558.87611: variable 'ansible_search_path' from source: unknown 28011 1726882558.87655: calling self._execute() 28011 1726882558.87761: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882558.87772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882558.87784: variable 'omit' from source: magic vars 28011 1726882558.88154: variable 'ansible_distribution_major_version' from source: facts 28011 1726882558.88275: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882558.88375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882558.90567: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882558.90647: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882558.90692: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882558.90733: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882558.90765: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882558.90889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882558.90899: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882558.90931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882558.90976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882558.91003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882558.91199: variable 'ansible_distribution' from source: facts 28011 1726882558.91203: variable 'ansible_distribution_major_version' from source: facts 28011 1726882558.91205: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28011 1726882558.91265: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882558.91403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882558.91437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882558.91468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882558.91513: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882558.91533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882558.91576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882558.91602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882558.91629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882558.91671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882558.91686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882558.91728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882558.91754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 
1726882558.91797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882558.91822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882558.91838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882558.92020: variable 'network_connections' from source: play vars 28011 1726882558.92023: variable 'profile' from source: play vars 28011 1726882558.92072: variable 'profile' from source: play vars 28011 1726882558.92080: variable 'interface' from source: set_fact 28011 1726882558.92143: variable 'interface' from source: set_fact 28011 1726882558.92210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882558.92395: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882558.92455: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882558.92474: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882558.92511: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882558.92564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882558.92798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882558.92810: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882558.92812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882558.92816: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882558.92939: variable 'network_connections' from source: play vars 28011 1726882558.92950: variable 'profile' from source: play vars 28011 1726882558.93012: variable 'profile' from source: play vars 28011 1726882558.93022: variable 'interface' from source: set_fact 28011 1726882558.93086: variable 'interface' from source: set_fact 28011 1726882558.93118: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28011 1726882558.93126: when evaluation is False, skipping this task 28011 1726882558.93133: _execute() done 28011 1726882558.93143: dumping result to json 28011 1726882558.93154: done dumping result, returning 28011 1726882558.93165: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-962d-7c65-0000000000bc] 28011 1726882558.93176: sending task result for task 12673a56-9f93-962d-7c65-0000000000bc 28011 1726882558.93500: done sending task result for task 12673a56-9f93-962d-7c65-0000000000bc 28011 1726882558.93503: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 28011 1726882558.93553: no more pending results, returning what we have 28011 1726882558.93556: results queue empty 28011 1726882558.93557: checking for any_errors_fatal 28011 1726882558.93563: done checking for any_errors_fatal 28011 1726882558.93564: checking for max_fail_percentage 28011 1726882558.93565: done checking for max_fail_percentage 28011 1726882558.93567: checking to see if all hosts have failed and the running result is not ok 28011 1726882558.93568: done checking to see if all hosts have failed 28011 1726882558.93568: getting the remaining hosts for this loop 28011 1726882558.93570: done getting the remaining hosts for this loop 28011 1726882558.93574: getting the next task for host managed_node1 28011 1726882558.93578: done getting next task for host managed_node1 28011 1726882558.93583: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28011 1726882558.93585: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882558.93600: getting variables 28011 1726882558.93602: in VariableManager get_vars() 28011 1726882558.93643: Calling all_inventory to load vars for managed_node1 28011 1726882558.93646: Calling groups_inventory to load vars for managed_node1 28011 1726882558.93648: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882558.93658: Calling all_plugins_play to load vars for managed_node1 28011 1726882558.93661: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882558.93665: Calling groups_plugins_play to load vars for managed_node1 28011 1726882558.95102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882558.96642: done with get_vars() 28011 1726882558.96667: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28011 1726882558.96746: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:35:58 -0400 (0:00:00.099) 0:00:28.519 ****** 28011 1726882558.96775: entering _queue_task() for managed_node1/yum 28011 1726882558.97119: worker is 1 (out of 1 available) 28011 1726882558.97131: exiting _queue_task() for managed_node1/yum 28011 1726882558.97143: done queuing things up, now waiting for results queue to drain 28011 1726882558.97145: waiting for pending results... 
28011 1726882558.97419: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28011 1726882558.97540: in run() - task 12673a56-9f93-962d-7c65-0000000000bd 28011 1726882558.97559: variable 'ansible_search_path' from source: unknown 28011 1726882558.97567: variable 'ansible_search_path' from source: unknown 28011 1726882558.97608: calling self._execute() 28011 1726882558.97708: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882558.97722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882558.97738: variable 'omit' from source: magic vars 28011 1726882558.98107: variable 'ansible_distribution_major_version' from source: facts 28011 1726882558.98125: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882558.98304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882559.01000: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882559.01004: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882559.01006: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882559.01010: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882559.01012: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882559.01014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882559.01041: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882559.01067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882559.01108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882559.01131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882559.01226: variable 'ansible_distribution_major_version' from source: facts 28011 1726882559.01249: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28011 1726882559.01257: when evaluation is False, skipping this task 28011 1726882559.01264: _execute() done 28011 1726882559.01271: dumping result to json 28011 1726882559.01278: done dumping result, returning 28011 1726882559.01291: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-962d-7c65-0000000000bd] 28011 1726882559.01302: sending task result for task 12673a56-9f93-962d-7c65-0000000000bd skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28011 1726882559.01466: no more pending results, returning what we have 28011 1726882559.01470: results queue empty 28011 1726882559.01471: checking for any_errors_fatal 28011 1726882559.01479: done 
checking for any_errors_fatal 28011 1726882559.01479: checking for max_fail_percentage 28011 1726882559.01482: done checking for max_fail_percentage 28011 1726882559.01483: checking to see if all hosts have failed and the running result is not ok 28011 1726882559.01484: done checking to see if all hosts have failed 28011 1726882559.01484: getting the remaining hosts for this loop 28011 1726882559.01486: done getting the remaining hosts for this loop 28011 1726882559.01490: getting the next task for host managed_node1 28011 1726882559.01498: done getting next task for host managed_node1 28011 1726882559.01503: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28011 1726882559.01505: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882559.01522: getting variables 28011 1726882559.01524: in VariableManager get_vars() 28011 1726882559.01567: Calling all_inventory to load vars for managed_node1 28011 1726882559.01570: Calling groups_inventory to load vars for managed_node1 28011 1726882559.01572: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882559.01583: Calling all_plugins_play to load vars for managed_node1 28011 1726882559.01587: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882559.01590: Calling groups_plugins_play to load vars for managed_node1 28011 1726882559.02406: done sending task result for task 12673a56-9f93-962d-7c65-0000000000bd 28011 1726882559.02409: WORKER PROCESS EXITING 28011 1726882559.03378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882559.04915: done with get_vars() 28011 1726882559.04939: done getting variables 28011 1726882559.05000: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:35:59 -0400 (0:00:00.082) 0:00:28.601 ****** 28011 1726882559.05031: entering _queue_task() for managed_node1/fail 28011 1726882559.05363: worker is 1 (out of 1 available) 28011 1726882559.05377: exiting _queue_task() for managed_node1/fail 28011 1726882559.05389: done queuing things up, now waiting for results queue to drain 28011 1726882559.05391: waiting for pending results... 
28011 1726882559.05680: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28011 1726882559.05803: in run() - task 12673a56-9f93-962d-7c65-0000000000be 28011 1726882559.05828: variable 'ansible_search_path' from source: unknown 28011 1726882559.05838: variable 'ansible_search_path' from source: unknown 28011 1726882559.05877: calling self._execute() 28011 1726882559.05982: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882559.05996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882559.06015: variable 'omit' from source: magic vars 28011 1726882559.06404: variable 'ansible_distribution_major_version' from source: facts 28011 1726882559.06420: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882559.06531: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882559.06720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882559.08907: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882559.08985: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882559.09028: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882559.09066: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882559.09107: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882559.09198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28011 1726882559.09236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882559.09265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882559.09316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882559.09335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882559.09383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882559.09415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882559.09443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882559.09486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882559.09513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882559.09556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882559.09582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882559.09614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882559.09900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882559.09904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882559.09906: variable 'network_connections' from source: play vars 28011 1726882559.09908: variable 'profile' from source: play vars 28011 1726882559.09941: variable 'profile' from source: play vars 28011 1726882559.09952: variable 'interface' from source: set_fact 28011 1726882559.10012: variable 'interface' from source: set_fact 28011 1726882559.10081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882559.10263: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882559.10309: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882559.10349: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882559.10383: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882559.10432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882559.10465: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882559.10498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882559.10530: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882559.10590: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882559.10843: variable 'network_connections' from source: play vars 28011 1726882559.10853: variable 'profile' from source: play vars 28011 1726882559.10923: variable 'profile' from source: play vars 28011 1726882559.10932: variable 'interface' from source: set_fact 28011 1726882559.11001: variable 'interface' from source: set_fact 28011 1726882559.11032: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28011 1726882559.11040: when evaluation is False, skipping this task 28011 1726882559.11048: _execute() done 28011 1726882559.11056: dumping result to json 28011 1726882559.11063: done dumping result, returning 28011 1726882559.11075: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-962d-7c65-0000000000be] 28011 1726882559.11096: sending task result for task 12673a56-9f93-962d-7c65-0000000000be skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28011 1726882559.11267: no more pending results, returning what we have 28011 1726882559.11271: results queue empty 28011 1726882559.11272: checking for any_errors_fatal 28011 1726882559.11279: done checking for any_errors_fatal 28011 1726882559.11280: checking for max_fail_percentage 28011 1726882559.11282: done checking for max_fail_percentage 28011 1726882559.11283: checking to see if all hosts have failed and the running result is not ok 28011 1726882559.11284: done checking to see if all hosts have failed 28011 1726882559.11285: getting the remaining hosts for this loop 28011 1726882559.11286: done getting the remaining hosts for this loop 28011 1726882559.11290: getting the next task for host managed_node1 28011 1726882559.11298: done getting next task for host managed_node1 28011 1726882559.11302: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28011 1726882559.11304: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882559.11319: getting variables 28011 1726882559.11321: in VariableManager get_vars() 28011 1726882559.11361: Calling all_inventory to load vars for managed_node1 28011 1726882559.11364: Calling groups_inventory to load vars for managed_node1 28011 1726882559.11367: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882559.11379: Calling all_plugins_play to load vars for managed_node1 28011 1726882559.11382: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882559.11386: Calling groups_plugins_play to load vars for managed_node1 28011 1726882559.12266: done sending task result for task 12673a56-9f93-962d-7c65-0000000000be 28011 1726882559.12270: WORKER PROCESS EXITING 28011 1726882559.13399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882559.14998: done with get_vars() 28011 1726882559.15024: done getting variables 28011 1726882559.15121: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:35:59 -0400 (0:00:00.101) 0:00:28.702 ****** 28011 1726882559.15152: entering _queue_task() for managed_node1/package 28011 1726882559.16174: worker is 1 (out of 1 available) 28011 1726882559.16187: exiting _queue_task() for managed_node1/package 28011 1726882559.16202: done queuing things up, now waiting for results queue to drain 28011 1726882559.16204: waiting for pending results... 
28011 1726882559.16737: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 28011 1726882559.17109: in run() - task 12673a56-9f93-962d-7c65-0000000000bf 28011 1726882559.17130: variable 'ansible_search_path' from source: unknown 28011 1726882559.17166: variable 'ansible_search_path' from source: unknown 28011 1726882559.17430: calling self._execute() 28011 1726882559.17598: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882559.17601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882559.17716: variable 'omit' from source: magic vars 28011 1726882559.18592: variable 'ansible_distribution_major_version' from source: facts 28011 1726882559.18744: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882559.19273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882559.19713: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882559.19769: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882559.19856: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882559.20695: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882559.20934: variable 'network_packages' from source: role '' defaults 28011 1726882559.21266: variable '__network_provider_setup' from source: role '' defaults 28011 1726882559.21285: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882559.21421: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882559.21435: variable '__network_packages_default_nm' from source: role '' defaults 28011 1726882559.21578: variable 
'__network_packages_default_nm' from source: role '' defaults 28011 1726882559.22011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882559.27110: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882559.27114: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882559.27201: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882559.27246: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882559.27345: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882559.27448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882559.27514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882559.27557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882559.27605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882559.27626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 
1726882559.27682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882559.27715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882559.27742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882559.27791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882559.27813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882559.28046: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28011 1726882559.28174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882559.28211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882559.28238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882559.28285: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28011 1726882559.28377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28011 1726882559.28414: variable 'ansible_python' from source: facts
28011 1726882559.28443: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
28011 1726882559.28544: variable '__network_wpa_supplicant_required' from source: role '' defaults
28011 1726882559.28637: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
28011 1726882559.28771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28011 1726882559.28804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28011 1726882559.28843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28011 1726882559.28886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28011 1726882559.28910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28011 1726882559.29060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28011 1726882559.29072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28011 1726882559.29074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28011 1726882559.29077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28011 1726882559.29101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28011 1726882559.29284: variable 'network_connections' from source: play vars
28011 1726882559.29302: variable 'profile' from source: play vars
28011 1726882559.29429: variable 'profile' from source: play vars
28011 1726882559.29440: variable 'interface' from source: set_fact
28011 1726882559.29508: variable 'interface' from source: set_fact
28011 1726882559.29636: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
28011 1726882559.29654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
28011 1726882559.29682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
28011 1726882559.29741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
28011 1726882559.29782: variable '__network_wireless_connections_defined' from source: role '' defaults
28011 1726882559.30371: variable 'network_connections' from source: play vars
28011 1726882559.30403: variable 'profile' from source: play vars
28011 1726882559.30646: variable 'profile' from source: play vars
28011 1726882559.30655: variable 'interface' from source: set_fact
28011 1726882559.30756: variable 'interface' from source: set_fact
28011 1726882559.30789: variable '__network_packages_default_wireless' from source: role '' defaults
28011 1726882559.30883: variable '__network_wireless_connections_defined' from source: role '' defaults
28011 1726882559.31214: variable 'network_connections' from source: play vars
28011 1726882559.31218: variable 'profile' from source: play vars
28011 1726882559.31286: variable 'profile' from source: play vars
28011 1726882559.31346: variable 'interface' from source: set_fact
28011 1726882559.31398: variable 'interface' from source: set_fact
28011 1726882559.31422: variable '__network_packages_default_team' from source: role '' defaults
28011 1726882559.31507: variable '__network_team_connections_defined' from source: role '' defaults
28011 1726882559.31827: variable 'network_connections' from source: play vars
28011 1726882559.31830: variable 'profile' from source: play vars
28011 1726882559.31901: variable 'profile' from source: play vars
28011 1726882559.31904: variable 'interface' from source: set_fact
28011 1726882559.32098: variable 'interface' from source: set_fact
28011 1726882559.32101: variable '__network_service_name_default_initscripts' from source: role '' defaults
28011 1726882559.32122: variable '__network_service_name_default_initscripts' from source: role '' defaults
28011 1726882559.32130: variable '__network_packages_default_initscripts' from source: role '' defaults
28011 1726882559.32186: variable '__network_packages_default_initscripts' from source: role '' defaults
28011 1726882559.32422: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
28011 1726882559.33425: variable 'network_connections' from source: play vars
28011 1726882559.33428: variable 'profile' from source: play vars
28011 1726882559.33523: variable 'profile' from source: play vars
28011 1726882559.33530: variable 'interface' from source: set_fact
28011 1726882559.33631: variable 'interface' from source: set_fact
28011 1726882559.33634: variable 'ansible_distribution' from source: facts
28011 1726882559.33638: variable '__network_rh_distros' from source: role '' defaults
28011 1726882559.33640: variable 'ansible_distribution_major_version' from source: facts
28011 1726882559.33643: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
28011 1726882559.33846: variable 'ansible_distribution' from source: facts
28011 1726882559.33849: variable '__network_rh_distros' from source: role '' defaults
28011 1726882559.33852: variable 'ansible_distribution_major_version' from source: facts
28011 1726882559.33854: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
28011 1726882559.34321: variable 'ansible_distribution' from source: facts
28011 1726882559.34326: variable '__network_rh_distros' from source: role '' defaults
28011 1726882559.34332: variable 'ansible_distribution_major_version' from source: facts
28011 1726882559.34370: variable 'network_provider' from source: set_fact
28011 1726882559.34398: variable 'ansible_facts' from source: unknown
28011 1726882559.35589: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
28011 1726882559.35595: when evaluation is False, skipping this task
28011 1726882559.35597: _execute() done
28011 1726882559.35599: dumping result to json
28011 1726882559.35601: done dumping result, returning
28011 1726882559.35609: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-962d-7c65-0000000000bf]
28011 1726882559.35612: sending task result for task 12673a56-9f93-962d-7c65-0000000000bf
28011 1726882559.35713: done sending task result for task 12673a56-9f93-962d-7c65-0000000000bf
28011 1726882559.35716: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
28011 1726882559.35764: no more pending results, returning what we have
28011 1726882559.35767: results queue empty
28011 1726882559.35768: checking for any_errors_fatal
28011 1726882559.35774: done checking for any_errors_fatal
28011 1726882559.35774: checking for max_fail_percentage
28011 1726882559.35776: done checking for max_fail_percentage
28011 1726882559.35777: checking to see if all hosts have failed and the running result is not ok
28011 1726882559.35777: done checking to see if all hosts have failed
28011 1726882559.35778: getting the remaining hosts for this loop
28011 1726882559.35779: done getting the remaining hosts for this loop
28011 1726882559.35783: getting the next task for host managed_node1
28011 1726882559.35791: done getting next task for host managed_node1
28011 1726882559.35796: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
28011 1726882559.35798: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28011 1726882559.35811: getting variables
28011 1726882559.35813: in VariableManager get_vars()
28011 1726882559.35849: Calling all_inventory to load vars for managed_node1
28011 1726882559.35852: Calling groups_inventory to load vars for managed_node1
28011 1726882559.35854: Calling all_plugins_inventory to load vars for managed_node1
28011 1726882559.35869: Calling all_plugins_play to load vars for managed_node1
28011 1726882559.35871: Calling groups_plugins_inventory to load vars for managed_node1
28011 1726882559.35874: Calling groups_plugins_play to load vars for managed_node1
28011 1726882559.36811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882559.38584: done with get_vars()
28011 1726882559.38629: done getting variables
28011 1726882559.38721: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:35:59 -0400 (0:00:00.235) 0:00:28.938 ******
28011 1726882559.38752: entering _queue_task() for managed_node1/package
28011 1726882559.39204: worker is 1 (out of 1 available)
28011 1726882559.39216: exiting _queue_task() for managed_node1/package
28011 1726882559.39342: done queuing things up, now waiting for results queue to drain
28011 1726882559.39344: waiting for pending results...
28011 1726882559.39647: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
28011 1726882559.39664: in run() - task 12673a56-9f93-962d-7c65-0000000000c0
28011 1726882559.39711: variable 'ansible_search_path' from source: unknown
28011 1726882559.39715: variable 'ansible_search_path' from source: unknown
28011 1726882559.39728: calling self._execute()
28011 1726882559.39814: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882559.39824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882559.39833: variable 'omit' from source: magic vars
28011 1726882559.40128: variable 'ansible_distribution_major_version' from source: facts
28011 1726882559.40138: Evaluated conditional (ansible_distribution_major_version != '6'): True
28011 1726882559.40222: variable 'network_state' from source: role '' defaults
28011 1726882559.40231: Evaluated conditional (network_state != {}): False
28011 1726882559.40235: when evaluation is False, skipping this task
28011 1726882559.40237: _execute() done
28011 1726882559.40240: dumping result to json
28011 1726882559.40242: done dumping result, returning
28011 1726882559.40248: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-962d-7c65-0000000000c0]
28011 1726882559.40254: sending task result for task 12673a56-9f93-962d-7c65-0000000000c0
28011 1726882559.40339: done sending task result for task 12673a56-9f93-962d-7c65-0000000000c0
28011 1726882559.40342: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
28011 1726882559.40387: no more pending results, returning what we have
28011 1726882559.40391: results queue empty
28011 1726882559.40391: checking for any_errors_fatal
28011 1726882559.40401: done checking for any_errors_fatal
28011 1726882559.40402: checking for max_fail_percentage
28011 1726882559.40403: done checking for max_fail_percentage
28011 1726882559.40404: checking to see if all hosts have failed and the running result is not ok
28011 1726882559.40405: done checking to see if all hosts have failed
28011 1726882559.40406: getting the remaining hosts for this loop
28011 1726882559.40407: done getting the remaining hosts for this loop
28011 1726882559.40411: getting the next task for host managed_node1
28011 1726882559.40416: done getting next task for host managed_node1
28011 1726882559.40420: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
28011 1726882559.40422: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28011 1726882559.40437: getting variables
28011 1726882559.40438: in VariableManager get_vars()
28011 1726882559.40481: Calling all_inventory to load vars for managed_node1
28011 1726882559.40483: Calling groups_inventory to load vars for managed_node1
28011 1726882559.40485: Calling all_plugins_inventory to load vars for managed_node1
28011 1726882559.40496: Calling all_plugins_play to load vars for managed_node1
28011 1726882559.40499: Calling groups_plugins_inventory to load vars for managed_node1
28011 1726882559.40501: Calling groups_plugins_play to load vars for managed_node1
28011 1726882559.41652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882559.43435: done with get_vars()
28011 1726882559.43450: done getting variables
28011 1726882559.43494: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:35:59 -0400 (0:00:00.047) 0:00:28.986 ******
28011 1726882559.43518: entering _queue_task() for managed_node1/package
28011 1726882559.43746: worker is 1 (out of 1 available)
28011 1726882559.43760: exiting _queue_task() for managed_node1/package
28011 1726882559.43771: done queuing things up, now waiting for results queue to drain
28011 1726882559.43773: waiting for pending results...
28011 1726882559.43940: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
28011 1726882559.44015: in run() - task 12673a56-9f93-962d-7c65-0000000000c1
28011 1726882559.44027: variable 'ansible_search_path' from source: unknown
28011 1726882559.44030: variable 'ansible_search_path' from source: unknown
28011 1726882559.44058: calling self._execute()
28011 1726882559.44134: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882559.44138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882559.44147: variable 'omit' from source: magic vars
28011 1726882559.44417: variable 'ansible_distribution_major_version' from source: facts
28011 1726882559.44426: Evaluated conditional (ansible_distribution_major_version != '6'): True
28011 1726882559.44512: variable 'network_state' from source: role '' defaults
28011 1726882559.44520: Evaluated conditional (network_state != {}): False
28011 1726882559.44523: when evaluation is False, skipping this task
28011 1726882559.44526: _execute() done
28011 1726882559.44529: dumping result to json
28011 1726882559.44531: done dumping result, returning
28011 1726882559.44539: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-962d-7c65-0000000000c1]
28011 1726882559.44542: sending task result for task 12673a56-9f93-962d-7c65-0000000000c1
28011 1726882559.44631: done sending task result for task 12673a56-9f93-962d-7c65-0000000000c1
28011 1726882559.44634: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
28011 1726882559.44698: no more pending results, returning what we have
28011 1726882559.44702: results queue empty
28011 1726882559.44703: checking for any_errors_fatal
28011 1726882559.44710: done checking for any_errors_fatal
28011 1726882559.44710: checking for max_fail_percentage
28011 1726882559.44712: done checking for max_fail_percentage
28011 1726882559.44713: checking to see if all hosts have failed and the running result is not ok
28011 1726882559.44714: done checking to see if all hosts have failed
28011 1726882559.44714: getting the remaining hosts for this loop
28011 1726882559.44716: done getting the remaining hosts for this loop
28011 1726882559.44719: getting the next task for host managed_node1
28011 1726882559.44723: done getting next task for host managed_node1
28011 1726882559.44730: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
28011 1726882559.44732: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28011 1726882559.44746: getting variables
28011 1726882559.44748: in VariableManager get_vars()
28011 1726882559.44777: Calling all_inventory to load vars for managed_node1
28011 1726882559.44779: Calling groups_inventory to load vars for managed_node1
28011 1726882559.44781: Calling all_plugins_inventory to load vars for managed_node1
28011 1726882559.44791: Calling all_plugins_play to load vars for managed_node1
28011 1726882559.44796: Calling groups_plugins_inventory to load vars for managed_node1
28011 1726882559.44798: Calling groups_plugins_play to load vars for managed_node1
28011 1726882559.46123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882559.47570: done with get_vars()
28011 1726882559.47585: done getting variables
28011 1726882559.47631: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:35:59 -0400 (0:00:00.041) 0:00:29.027 ******
28011 1726882559.47653: entering _queue_task() for managed_node1/service
28011 1726882559.47886: worker is 1 (out of 1 available)
28011 1726882559.47900: exiting _queue_task() for managed_node1/service
28011 1726882559.47913: done queuing things up, now waiting for results queue to drain
28011 1726882559.47915: waiting for pending results...
28011 1726882559.48090: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
28011 1726882559.48169: in run() - task 12673a56-9f93-962d-7c65-0000000000c2
28011 1726882559.48181: variable 'ansible_search_path' from source: unknown
28011 1726882559.48184: variable 'ansible_search_path' from source: unknown
28011 1726882559.48216: calling self._execute()
28011 1726882559.48298: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882559.48302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882559.48312: variable 'omit' from source: magic vars
28011 1726882559.48583: variable 'ansible_distribution_major_version' from source: facts
28011 1726882559.48597: Evaluated conditional (ansible_distribution_major_version != '6'): True
28011 1726882559.48680: variable '__network_wireless_connections_defined' from source: role '' defaults
28011 1726882559.48810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28011 1726882559.50627: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28011 1726882559.50917: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28011 1726882559.50944: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
28011 1726882559.50968: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
28011 1726882559.50988: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
28011 1726882559.51053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28011 1726882559.51073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28011 1726882559.51095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28011 1726882559.51123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28011 1726882559.51198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28011 1726882559.51201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28011 1726882559.51203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28011 1726882559.51206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28011 1726882559.51222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28011 1726882559.51233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28011 1726882559.51265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28011 1726882559.51281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28011 1726882559.51300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28011 1726882559.51325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28011 1726882559.51335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28011 1726882559.51445: variable 'network_connections' from source: play vars
28011 1726882559.51462: variable 'profile' from source: play vars
28011 1726882559.51509: variable 'profile' from source: play vars
28011 1726882559.51513: variable 'interface' from source: set_fact
28011 1726882559.51555: variable 'interface' from source: set_fact
28011 1726882559.51608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
28011 1726882559.51714: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
28011 1726882559.51740: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
28011 1726882559.51762: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
28011 1726882559.51797: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
28011 1726882559.51827: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
28011 1726882559.51841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
28011 1726882559.51858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
28011 1726882559.51875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
28011 1726882559.51914: variable '__network_team_connections_defined' from source: role '' defaults
28011 1726882559.52061: variable 'network_connections' from source: play vars
28011 1726882559.52064: variable 'profile' from source: play vars
28011 1726882559.52111: variable 'profile' from source: play vars
28011 1726882559.52114: variable 'interface' from source: set_fact
28011 1726882559.52156: variable 'interface' from source: set_fact
28011 1726882559.52173: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
28011 1726882559.52177: when evaluation is False, skipping this task
28011 1726882559.52179: _execute() done
28011 1726882559.52182: dumping result to json
28011 1726882559.52184: done dumping result, returning
28011 1726882559.52190: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-962d-7c65-0000000000c2]
28011 1726882559.52204: sending task result for task 12673a56-9f93-962d-7c65-0000000000c2
28011 1726882559.52289: done sending task result for task 12673a56-9f93-962d-7c65-0000000000c2
28011 1726882559.52292: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
28011 1726882559.52359: no more pending results, returning what we have
28011 1726882559.52363: results queue empty
28011 1726882559.52364: checking for any_errors_fatal
28011 1726882559.52369: done checking for any_errors_fatal
28011 1726882559.52370: checking for max_fail_percentage
28011 1726882559.52371: done checking for max_fail_percentage
28011 1726882559.52372: checking to see if all hosts have failed and the running result is not ok
28011 1726882559.52373: done checking to see if all hosts have failed
28011 1726882559.52374: getting the remaining hosts for this loop
28011 1726882559.52375: done getting the remaining hosts for this loop
28011 1726882559.52379: getting the next task for host managed_node1
28011 1726882559.52383: done getting next task for host managed_node1
28011 1726882559.52386: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
28011 1726882559.52388: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28011 1726882559.52403: getting variables
28011 1726882559.52405: in VariableManager get_vars()
28011 1726882559.52440: Calling all_inventory to load vars for managed_node1
28011 1726882559.52443: Calling groups_inventory to load vars for managed_node1
28011 1726882559.52445: Calling all_plugins_inventory to load vars for managed_node1
28011 1726882559.52453: Calling all_plugins_play to load vars for managed_node1
28011 1726882559.52455: Calling groups_plugins_inventory to load vars for managed_node1
28011 1726882559.52458: Calling groups_plugins_play to load vars for managed_node1
28011 1726882559.53362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28011 1726882559.54232: done with get_vars()
28011 1726882559.54246: done getting variables
28011 1726882559.54285: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:35:59 -0400 (0:00:00.066) 0:00:29.094 ******
28011 1726882559.54308: entering _queue_task() for managed_node1/service
28011 1726882559.54526: worker is 1 (out of 1 available)
28011 1726882559.54538: exiting _queue_task() for managed_node1/service
28011 1726882559.54549: done queuing things up, now waiting for results queue to drain
28011 1726882559.54551: waiting for pending results...
28011 1726882559.54714: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
28011 1726882559.54784: in run() - task 12673a56-9f93-962d-7c65-0000000000c3
28011 1726882559.54799: variable 'ansible_search_path' from source: unknown
28011 1726882559.54803: variable 'ansible_search_path' from source: unknown
28011 1726882559.54831: calling self._execute()
28011 1726882559.54909: variable 'ansible_host' from source: host vars for 'managed_node1'
28011 1726882559.54913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
28011 1726882559.54922: variable 'omit' from source: magic vars
28011 1726882559.55179: variable 'ansible_distribution_major_version' from source: facts
28011 1726882559.55188: Evaluated conditional (ansible_distribution_major_version != '6'): True
28011 1726882559.55298: variable 'network_provider' from source: set_fact
28011 1726882559.55302: variable 'network_state' from source: role '' defaults
28011 1726882559.55312: Evaluated conditional (network_provider == "nm" or network_state != {}): True
28011 1726882559.55319: variable 'omit' from source: magic vars
28011 1726882559.55348: variable 'omit' from source: magic vars
28011 1726882559.55368: variable 'network_service_name' from source: role '' defaults
28011 1726882559.55421: variable 'network_service_name' from source: role '' defaults
28011 1726882559.55496: variable '__network_provider_setup' from source: role '' defaults
28011 1726882559.55501: variable '__network_service_name_default_nm' from source: role '' defaults
28011 1726882559.55548: variable '__network_service_name_default_nm' from source: role '' defaults
28011 1726882559.55552: variable '__network_packages_default_nm' from source: role '' defaults
28011 1726882559.55599: variable '__network_packages_default_nm' from source: role '' defaults
28011 1726882559.55741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28011 1726882559.57161: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28011 1726882559.57220: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28011 1726882559.57248: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
28011 1726882559.57272: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
28011 1726882559.57298: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
28011 1726882559.57355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28011 1726882559.57375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28011 1726882559.57399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28011 1726882559.57426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28011 1726882559.57437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28011 1726882559.57469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28011 1726882559.57484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28011 1726882559.57508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28011 1726882559.57533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28011 1726882559.57543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28011 1726882559.57689: variable '__network_packages_default_gobject_packages' from source: role '' defaults
28011 1726882559.57766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28011 1726882559.57783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28011 1726882559.57805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28011 1726882559.57832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28011 1726882559.57843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28011 1726882559.57906: variable 'ansible_python' from source: facts
28011 1726882559.57923: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
28011 1726882559.57980: variable '__network_wpa_supplicant_required' from source: role '' defaults
28011 1726882559.58037: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
28011 1726882559.58120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28011 1726882559.58136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28011 1726882559.58157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28011 1726882559.58179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28011 1726882559.58192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28011 1726882559.58225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28011 1726882559.58245: Loading FilterModule
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882559.58261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882559.58296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882559.58307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882559.58399: variable 'network_connections' from source: play vars 28011 1726882559.58406: variable 'profile' from source: play vars 28011 1726882559.58456: variable 'profile' from source: play vars 28011 1726882559.58461: variable 'interface' from source: set_fact 28011 1726882559.58509: variable 'interface' from source: set_fact 28011 1726882559.58576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882559.58707: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882559.58741: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882559.58771: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882559.58804: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882559.58847: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882559.58867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882559.58889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882559.58917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882559.58952: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882559.59127: variable 'network_connections' from source: play vars 28011 1726882559.59132: variable 'profile' from source: play vars 28011 1726882559.59186: variable 'profile' from source: play vars 28011 1726882559.59195: variable 'interface' from source: set_fact 28011 1726882559.59237: variable 'interface' from source: set_fact 28011 1726882559.59263: variable '__network_packages_default_wireless' from source: role '' defaults 28011 1726882559.59319: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882559.59503: variable 'network_connections' from source: play vars 28011 1726882559.59506: variable 'profile' from source: play vars 28011 1726882559.59555: variable 'profile' from source: play vars 28011 1726882559.59558: variable 'interface' from source: set_fact 28011 1726882559.59616: variable 'interface' from source: set_fact 28011 1726882559.59635: variable '__network_packages_default_team' from source: role '' defaults 28011 1726882559.59690: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882559.59873: variable 
'network_connections' from source: play vars 28011 1726882559.59876: variable 'profile' from source: play vars 28011 1726882559.59930: variable 'profile' from source: play vars 28011 1726882559.59933: variable 'interface' from source: set_fact 28011 1726882559.59982: variable 'interface' from source: set_fact 28011 1726882559.60097: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 1726882559.60100: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 1726882559.60101: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882559.60115: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882559.60257: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28011 1726882559.60590: variable 'network_connections' from source: play vars 28011 1726882559.60595: variable 'profile' from source: play vars 28011 1726882559.60635: variable 'profile' from source: play vars 28011 1726882559.60638: variable 'interface' from source: set_fact 28011 1726882559.60691: variable 'interface' from source: set_fact 28011 1726882559.60698: variable 'ansible_distribution' from source: facts 28011 1726882559.60700: variable '__network_rh_distros' from source: role '' defaults 28011 1726882559.60708: variable 'ansible_distribution_major_version' from source: facts 28011 1726882559.60718: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28011 1726882559.60830: variable 'ansible_distribution' from source: facts 28011 1726882559.60834: variable '__network_rh_distros' from source: role '' defaults 28011 1726882559.60837: variable 'ansible_distribution_major_version' from source: facts 28011 1726882559.60848: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28011 1726882559.60958: variable 'ansible_distribution' from source: 
facts 28011 1726882559.60961: variable '__network_rh_distros' from source: role '' defaults 28011 1726882559.60965: variable 'ansible_distribution_major_version' from source: facts 28011 1726882559.60997: variable 'network_provider' from source: set_fact 28011 1726882559.61014: variable 'omit' from source: magic vars 28011 1726882559.61033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882559.61053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882559.61068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882559.61081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882559.61094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882559.61117: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882559.61120: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882559.61122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882559.61184: Set connection var ansible_connection to ssh 28011 1726882559.61191: Set connection var ansible_pipelining to False 28011 1726882559.61198: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882559.61205: Set connection var ansible_shell_executable to /bin/sh 28011 1726882559.61212: Set connection var ansible_timeout to 10 28011 1726882559.61217: Set connection var ansible_shell_type to sh 28011 1726882559.61237: variable 'ansible_shell_executable' from source: unknown 28011 1726882559.61240: variable 'ansible_connection' from source: unknown 28011 1726882559.61242: variable 'ansible_module_compression' from source: unknown 28011 1726882559.61244: 
variable 'ansible_shell_type' from source: unknown 28011 1726882559.61246: variable 'ansible_shell_executable' from source: unknown 28011 1726882559.61248: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882559.61255: variable 'ansible_pipelining' from source: unknown 28011 1726882559.61257: variable 'ansible_timeout' from source: unknown 28011 1726882559.61259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882559.61333: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882559.61342: variable 'omit' from source: magic vars 28011 1726882559.61345: starting attempt loop 28011 1726882559.61348: running the handler 28011 1726882559.61404: variable 'ansible_facts' from source: unknown 28011 1726882559.61873: _low_level_execute_command(): starting 28011 1726882559.61876: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882559.62375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882559.62379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882559.62382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882559.62384: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882559.62389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882559.62439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882559.62444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882559.62446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882559.62504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882559.64180: stdout chunk (state=3): >>>/root <<< 28011 1726882559.64274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882559.64308: stderr chunk (state=3): >>><<< 28011 1726882559.64311: stdout chunk (state=3): >>><<< 28011 1726882559.64328: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882559.64341: _low_level_execute_command(): starting 28011 1726882559.64345: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044 `" && echo ansible-tmp-1726882559.6432917-29343-178212183428044="` echo /root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044 `" ) && sleep 0' 28011 1726882559.64776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882559.64779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882559.64781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882559.64783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882559.64785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882559.64838: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882559.64841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882559.64885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882559.66742: stdout chunk (state=3): >>>ansible-tmp-1726882559.6432917-29343-178212183428044=/root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044 <<< 28011 1726882559.66848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882559.66871: stderr chunk (state=3): >>><<< 28011 1726882559.66874: stdout chunk (state=3): >>><<< 28011 1726882559.66887: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882559.6432917-29343-178212183428044=/root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882559.66914: variable 'ansible_module_compression' from source: unknown 28011 1726882559.66952: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28011 1726882559.67007: variable 'ansible_facts' from source: unknown 28011 1726882559.67142: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044/AnsiballZ_systemd.py 28011 1726882559.67238: Sending initial data 28011 1726882559.67241: Sent initial data (156 bytes) 28011 1726882559.67678: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882559.67682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882559.67684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882559.67686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882559.67690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882559.67743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 
1726882559.67748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882559.67791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882559.69301: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28011 1726882559.69309: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882559.69339: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882559.69380: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpytqdbmk1 /root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044/AnsiballZ_systemd.py <<< 28011 1726882559.69388: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044/AnsiballZ_systemd.py" <<< 28011 1726882559.69425: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpytqdbmk1" to remote "/root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044/AnsiballZ_systemd.py" <<< 28011 1726882559.70474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882559.70511: stderr chunk (state=3): >>><<< 28011 1726882559.70515: stdout chunk (state=3): >>><<< 28011 1726882559.70550: done transferring module to remote 28011 1726882559.70557: _low_level_execute_command(): starting 28011 1726882559.70561: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044/ /root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044/AnsiballZ_systemd.py && sleep 0' 28011 1726882559.70976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882559.70979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882559.70982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882559.70984: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882559.70988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882559.71039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882559.71043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882559.71091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882559.72804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882559.72825: stderr chunk (state=3): >>><<< 28011 1726882559.72828: stdout chunk (state=3): >>><<< 28011 1726882559.72838: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882559.72841: _low_level_execute_command(): starting 28011 1726882559.72846: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044/AnsiballZ_systemd.py && sleep 0' 28011 1726882559.73273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882559.73276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882559.73280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882559.73290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882559.73294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882559.73336: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882559.73343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882559.73344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882559.73389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882560.02344: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10833920", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303096320", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1634763000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", 
"StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": 
"-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28011 1726882560.04064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882560.04068: stdout chunk (state=3): >>><<< 28011 1726882560.04074: stderr chunk (state=3): >>><<< 28011 1726882560.04094: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10833920", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303096320", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1634763000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882560.04399: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882560.04510: _low_level_execute_command(): starting 28011 1726882560.04520: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882559.6432917-29343-178212183428044/ > /dev/null 2>&1 && sleep 0' 28011 1726882560.05858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882560.06116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882560.06189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882560.08043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882560.08047: stdout chunk (state=3): >>><<< 28011 1726882560.08050: stderr chunk (state=3): >>><<< 28011 1726882560.08065: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882560.08076: handler run complete 28011 1726882560.08147: attempt loop complete, returning result 28011 1726882560.08155: _execute() done 28011 
1726882560.08162: dumping result to json 28011 1726882560.08184: done dumping result, returning 28011 1726882560.08500: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-962d-7c65-0000000000c3] 28011 1726882560.08503: sending task result for task 12673a56-9f93-962d-7c65-0000000000c3 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882560.08762: no more pending results, returning what we have 28011 1726882560.08765: results queue empty 28011 1726882560.08765: checking for any_errors_fatal 28011 1726882560.08771: done checking for any_errors_fatal 28011 1726882560.08771: checking for max_fail_percentage 28011 1726882560.08773: done checking for max_fail_percentage 28011 1726882560.08774: checking to see if all hosts have failed and the running result is not ok 28011 1726882560.08775: done checking to see if all hosts have failed 28011 1726882560.08775: getting the remaining hosts for this loop 28011 1726882560.08777: done getting the remaining hosts for this loop 28011 1726882560.08780: getting the next task for host managed_node1 28011 1726882560.08785: done getting next task for host managed_node1 28011 1726882560.08788: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28011 1726882560.08789: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882560.08804: getting variables 28011 1726882560.08806: in VariableManager get_vars() 28011 1726882560.08837: Calling all_inventory to load vars for managed_node1 28011 1726882560.08839: Calling groups_inventory to load vars for managed_node1 28011 1726882560.08841: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882560.08851: Calling all_plugins_play to load vars for managed_node1 28011 1726882560.08853: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882560.08856: Calling groups_plugins_play to load vars for managed_node1 28011 1726882560.09819: done sending task result for task 12673a56-9f93-962d-7c65-0000000000c3 28011 1726882560.09822: WORKER PROCESS EXITING 28011 1726882560.11458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882560.14761: done with get_vars() 28011 1726882560.14783: done getting variables 28011 1726882560.14854: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:00 -0400 (0:00:00.605) 0:00:29.700 ****** 28011 1726882560.14885: entering _queue_task() for managed_node1/service 28011 1726882560.15245: worker is 1 (out of 1 available) 28011 1726882560.15265: exiting _queue_task() for managed_node1/service 28011 1726882560.15278: done queuing things up, now waiting for results queue to drain 28011 1726882560.15281: waiting for pending results... 
28011 1726882560.15580: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28011 1726882560.15672: in run() - task 12673a56-9f93-962d-7c65-0000000000c4 28011 1726882560.15685: variable 'ansible_search_path' from source: unknown 28011 1726882560.15691: variable 'ansible_search_path' from source: unknown 28011 1726882560.15738: calling self._execute() 28011 1726882560.15836: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882560.15840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882560.15852: variable 'omit' from source: magic vars 28011 1726882560.16261: variable 'ansible_distribution_major_version' from source: facts 28011 1726882560.16273: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882560.16663: variable 'network_provider' from source: set_fact 28011 1726882560.16667: Evaluated conditional (network_provider == "nm"): True 28011 1726882560.16768: variable '__network_wpa_supplicant_required' from source: role '' defaults 28011 1726882560.16861: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28011 1726882560.17201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882560.19171: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882560.19240: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882560.19276: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882560.19317: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882560.19344: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882560.19440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882560.19469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882560.19495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882560.19654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882560.19665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882560.19962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882560.19985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882560.20010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882560.20174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882560.20191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882560.20250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882560.20390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882560.20417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882560.20522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882560.20898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882560.20901: variable 'network_connections' from source: play vars 28011 1726882560.20904: variable 'profile' from source: play vars 28011 1726882560.20955: variable 'profile' from source: play vars 28011 1726882560.20959: variable 'interface' from source: set_fact 28011 1726882560.21398: variable 'interface' from source: set_fact 28011 1726882560.21402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882560.21707: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882560.21749: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882560.21860: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882560.21892: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882560.21995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882560.22019: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882560.22044: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882560.22183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882560.22229: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882560.22761: variable 'network_connections' from source: play vars 28011 1726882560.22765: variable 'profile' from source: play vars 28011 1726882560.22945: variable 'profile' from source: play vars 28011 1726882560.23002: variable 'interface' from source: set_fact 28011 1726882560.23128: variable 'interface' from source: set_fact 28011 1726882560.23192: Evaluated conditional (__network_wpa_supplicant_required): False 28011 1726882560.23204: when evaluation is False, skipping this task 28011 1726882560.23207: _execute() done 28011 1726882560.23219: dumping result 
to json 28011 1726882560.23221: done dumping result, returning 28011 1726882560.23224: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-962d-7c65-0000000000c4] 28011 1726882560.23226: sending task result for task 12673a56-9f93-962d-7c65-0000000000c4 28011 1726882560.23329: done sending task result for task 12673a56-9f93-962d-7c65-0000000000c4 28011 1726882560.23333: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28011 1726882560.23390: no more pending results, returning what we have 28011 1726882560.23395: results queue empty 28011 1726882560.23397: checking for any_errors_fatal 28011 1726882560.23422: done checking for any_errors_fatal 28011 1726882560.23423: checking for max_fail_percentage 28011 1726882560.23424: done checking for max_fail_percentage 28011 1726882560.23425: checking to see if all hosts have failed and the running result is not ok 28011 1726882560.23426: done checking to see if all hosts have failed 28011 1726882560.23427: getting the remaining hosts for this loop 28011 1726882560.23429: done getting the remaining hosts for this loop 28011 1726882560.23433: getting the next task for host managed_node1 28011 1726882560.23439: done getting next task for host managed_node1 28011 1726882560.23442: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28011 1726882560.23444: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882560.23458: getting variables 28011 1726882560.23460: in VariableManager get_vars() 28011 1726882560.23701: Calling all_inventory to load vars for managed_node1 28011 1726882560.23704: Calling groups_inventory to load vars for managed_node1 28011 1726882560.23707: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882560.23723: Calling all_plugins_play to load vars for managed_node1 28011 1726882560.23726: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882560.23729: Calling groups_plugins_play to load vars for managed_node1 28011 1726882560.27027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882560.29669: done with get_vars() 28011 1726882560.29704: done getting variables 28011 1726882560.29766: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:00 -0400 (0:00:00.149) 0:00:29.849 ****** 28011 1726882560.29805: entering _queue_task() for managed_node1/service 28011 1726882560.30180: worker is 1 (out of 1 available) 28011 1726882560.30198: exiting _queue_task() for managed_node1/service 28011 1726882560.30210: done queuing things up, now waiting for results queue to drain 28011 1726882560.30212: waiting for pending results... 
28011 1726882560.30872: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 28011 1726882560.30927: in run() - task 12673a56-9f93-962d-7c65-0000000000c5 28011 1726882560.30941: variable 'ansible_search_path' from source: unknown 28011 1726882560.30945: variable 'ansible_search_path' from source: unknown 28011 1726882560.31099: calling self._execute() 28011 1726882560.31197: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882560.31311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882560.31322: variable 'omit' from source: magic vars 28011 1726882560.31884: variable 'ansible_distribution_major_version' from source: facts 28011 1726882560.31903: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882560.32030: variable 'network_provider' from source: set_fact 28011 1726882560.32035: Evaluated conditional (network_provider == "initscripts"): False 28011 1726882560.32039: when evaluation is False, skipping this task 28011 1726882560.32042: _execute() done 28011 1726882560.32044: dumping result to json 28011 1726882560.32046: done dumping result, returning 28011 1726882560.32056: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-962d-7c65-0000000000c5] 28011 1726882560.32062: sending task result for task 12673a56-9f93-962d-7c65-0000000000c5 28011 1726882560.32156: done sending task result for task 12673a56-9f93-962d-7c65-0000000000c5 28011 1726882560.32160: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882560.32210: no more pending results, returning what we have 28011 1726882560.32213: results queue empty 28011 1726882560.32214: checking for any_errors_fatal 28011 1726882560.32225: done checking for 
any_errors_fatal 28011 1726882560.32225: checking for max_fail_percentage 28011 1726882560.32227: done checking for max_fail_percentage 28011 1726882560.32228: checking to see if all hosts have failed and the running result is not ok 28011 1726882560.32229: done checking to see if all hosts have failed 28011 1726882560.32230: getting the remaining hosts for this loop 28011 1726882560.32231: done getting the remaining hosts for this loop 28011 1726882560.32235: getting the next task for host managed_node1 28011 1726882560.32241: done getting next task for host managed_node1 28011 1726882560.32245: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28011 1726882560.32248: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882560.32263: getting variables 28011 1726882560.32265: in VariableManager get_vars() 28011 1726882560.32307: Calling all_inventory to load vars for managed_node1 28011 1726882560.32310: Calling groups_inventory to load vars for managed_node1 28011 1726882560.32313: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882560.32325: Calling all_plugins_play to load vars for managed_node1 28011 1726882560.32327: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882560.32330: Calling groups_plugins_play to load vars for managed_node1 28011 1726882560.34138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882560.35769: done with get_vars() 28011 1726882560.35792: done getting variables 28011 1726882560.35853: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:00 -0400 (0:00:00.060) 0:00:29.910 ****** 28011 1726882560.35894: entering _queue_task() for managed_node1/copy 28011 1726882560.36322: worker is 1 (out of 1 available) 28011 1726882560.36333: exiting _queue_task() for managed_node1/copy 28011 1726882560.36344: done queuing things up, now waiting for results queue to drain 28011 1726882560.36346: waiting for pending results... 
28011 1726882560.36575: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28011 1726882560.36670: in run() - task 12673a56-9f93-962d-7c65-0000000000c6 28011 1726882560.36674: variable 'ansible_search_path' from source: unknown 28011 1726882560.36677: variable 'ansible_search_path' from source: unknown 28011 1726882560.36777: calling self._execute() 28011 1726882560.36796: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882560.36807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882560.36820: variable 'omit' from source: magic vars 28011 1726882560.37226: variable 'ansible_distribution_major_version' from source: facts 28011 1726882560.37245: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882560.37356: variable 'network_provider' from source: set_fact 28011 1726882560.37366: Evaluated conditional (network_provider == "initscripts"): False 28011 1726882560.37372: when evaluation is False, skipping this task 28011 1726882560.37377: _execute() done 28011 1726882560.37382: dumping result to json 28011 1726882560.37397: done dumping result, returning 28011 1726882560.37408: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-962d-7c65-0000000000c6] 28011 1726882560.37417: sending task result for task 12673a56-9f93-962d-7c65-0000000000c6 skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28011 1726882560.37659: no more pending results, returning what we have 28011 1726882560.37663: results queue empty 28011 1726882560.37664: checking for any_errors_fatal 28011 1726882560.37671: done checking for any_errors_fatal 28011 1726882560.37672: checking for max_fail_percentage 28011 
1726882560.37674: done checking for max_fail_percentage 28011 1726882560.37675: checking to see if all hosts have failed and the running result is not ok 28011 1726882560.37676: done checking to see if all hosts have failed 28011 1726882560.37677: getting the remaining hosts for this loop 28011 1726882560.37678: done getting the remaining hosts for this loop 28011 1726882560.37682: getting the next task for host managed_node1 28011 1726882560.37690: done getting next task for host managed_node1 28011 1726882560.37696: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28011 1726882560.37699: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882560.37717: done sending task result for task 12673a56-9f93-962d-7c65-0000000000c6 28011 1726882560.37720: WORKER PROCESS EXITING 28011 1726882560.37899: getting variables 28011 1726882560.37901: in VariableManager get_vars() 28011 1726882560.37933: Calling all_inventory to load vars for managed_node1 28011 1726882560.37935: Calling groups_inventory to load vars for managed_node1 28011 1726882560.37938: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882560.37946: Calling all_plugins_play to load vars for managed_node1 28011 1726882560.37948: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882560.37951: Calling groups_plugins_play to load vars for managed_node1 28011 1726882560.39316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882560.41021: done with get_vars() 28011 1726882560.41053: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:00 -0400 (0:00:00.052) 0:00:29.962 ****** 28011 1726882560.41149: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 28011 1726882560.41633: worker is 1 (out of 1 available) 28011 1726882560.41644: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 28011 1726882560.41656: done queuing things up, now waiting for results queue to drain 28011 1726882560.41658: waiting for pending results... 28011 1726882560.41900: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28011 1726882560.41969: in run() - task 12673a56-9f93-962d-7c65-0000000000c7 28011 1726882560.42001: variable 'ansible_search_path' from source: unknown 28011 1726882560.42011: variable 'ansible_search_path' from source: unknown 28011 1726882560.42053: calling self._execute() 28011 1726882560.42163: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882560.42209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882560.42213: variable 'omit' from source: magic vars 28011 1726882560.42607: variable 'ansible_distribution_major_version' from source: facts 28011 1726882560.42625: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882560.42642: variable 'omit' from source: magic vars 28011 1726882560.42751: variable 'omit' from source: magic vars 28011 1726882560.42869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882560.45527: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882560.45607: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 
1726882560.45652: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882560.45700: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882560.45733: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882560.45827: variable 'network_provider' from source: set_fact 28011 1726882560.45966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882560.46101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882560.46104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882560.46108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882560.46110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882560.46173: variable 'omit' from source: magic vars 28011 1726882560.46303: variable 'omit' from source: magic vars 28011 1726882560.46434: variable 'network_connections' from source: play vars 28011 1726882560.46456: variable 'profile' from source: play vars 28011 1726882560.46531: variable 'profile' from source: play vars 28011 1726882560.46555: variable 'interface' from source: set_fact 
28011 1726882560.46613: variable 'interface' from source: set_fact 28011 1726882560.46798: variable 'omit' from source: magic vars 28011 1726882560.46802: variable '__lsr_ansible_managed' from source: task vars 28011 1726882560.46853: variable '__lsr_ansible_managed' from source: task vars 28011 1726882560.47166: Loaded config def from plugin (lookup/template) 28011 1726882560.47177: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28011 1726882560.47296: File lookup term: get_ansible_managed.j2 28011 1726882560.47300: variable 'ansible_search_path' from source: unknown 28011 1726882560.47303: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28011 1726882560.47307: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28011 1726882560.47309: variable 'ansible_search_path' from source: unknown 28011 
1726882560.59504: variable 'ansible_managed' from source: unknown 28011 1726882560.59680: variable 'omit' from source: magic vars 28011 1726882560.59761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882560.59948: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882560.59951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882560.59955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882560.59969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882560.60004: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882560.60060: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882560.60067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882560.60154: Set connection var ansible_connection to ssh 28011 1726882560.60297: Set connection var ansible_pipelining to False 28011 1726882560.60300: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882560.60304: Set connection var ansible_shell_executable to /bin/sh 28011 1726882560.60315: Set connection var ansible_timeout to 10 28011 1726882560.60322: Set connection var ansible_shell_type to sh 28011 1726882560.60347: variable 'ansible_shell_executable' from source: unknown 28011 1726882560.60383: variable 'ansible_connection' from source: unknown 28011 1726882560.60392: variable 'ansible_module_compression' from source: unknown 28011 1726882560.60400: variable 'ansible_shell_type' from source: unknown 28011 1726882560.60406: variable 'ansible_shell_executable' from source: unknown 28011 1726882560.60412: variable 'ansible_host' from source: 
host vars for 'managed_node1' 28011 1726882560.60419: variable 'ansible_pipelining' from source: unknown 28011 1726882560.60425: variable 'ansible_timeout' from source: unknown 28011 1726882560.60431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882560.60899: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882560.60910: variable 'omit' from source: magic vars 28011 1726882560.60913: starting attempt loop 28011 1726882560.60915: running the handler 28011 1726882560.60919: _low_level_execute_command(): starting 28011 1726882560.60922: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882560.62181: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882560.62203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882560.62218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882560.62461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882560.62475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882560.62564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882560.64417: stdout chunk (state=3): >>>/root <<< 28011 1726882560.64433: stdout chunk (state=3): >>><<< 28011 1726882560.64440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882560.64449: stderr chunk (state=3): >>><<< 28011 1726882560.64472: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882560.64492: _low_level_execute_command(): starting 28011 1726882560.64524: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941 `" && echo ansible-tmp-1726882560.6447797-29378-125055912560941="` echo /root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941 `" ) && sleep 0' 28011 1726882560.65498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882560.65515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882560.65536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882560.65569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882560.65600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882560.65643: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882560.65716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882560.65750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882560.66098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882560.66147: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28011 1726882560.68018: stdout chunk (state=3): >>>ansible-tmp-1726882560.6447797-29378-125055912560941=/root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941 <<< 28011 1726882560.68224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882560.68227: stdout chunk (state=3): >>><<< 28011 1726882560.68230: stderr chunk (state=3): >>><<< 28011 1726882560.68232: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882560.6447797-29378-125055912560941=/root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882560.68441: variable 'ansible_module_compression' from source: unknown 28011 1726882560.68445: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28011 1726882560.68447: variable 'ansible_facts' from source: unknown 28011 1726882560.68654: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941/AnsiballZ_network_connections.py 28011 1726882560.69185: Sending initial data 28011 1726882560.69192: Sent initial data (168 bytes) 28011 1726882560.69677: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882560.69692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882560.69710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882560.69743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882560.69847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882560.70199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882560.70241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 
1726882560.71770: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882560.71816: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882560.71858: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpqvfcvphq /root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941/AnsiballZ_network_connections.py <<< 28011 1726882560.71869: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941/AnsiballZ_network_connections.py" <<< 28011 1726882560.71907: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpqvfcvphq" to remote "/root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941/AnsiballZ_network_connections.py" <<< 28011 1726882560.73647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882560.73870: stderr chunk (state=3): >>><<< 28011 1726882560.73873: stdout chunk (state=3): >>><<< 28011 
1726882560.73876: done transferring module to remote 28011 1726882560.73878: _low_level_execute_command(): starting 28011 1726882560.73880: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941/ /root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941/AnsiballZ_network_connections.py && sleep 0' 28011 1726882560.74923: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882560.74966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882560.74969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882560.74972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882560.74974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882560.74976: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882560.75092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882560.75160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882560.75180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882560.75322: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882560.77101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882560.77105: stdout chunk (state=3): >>><<< 28011 1726882560.77111: stderr chunk (state=3): >>><<< 28011 1726882560.77227: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882560.77231: _low_level_execute_command(): starting 28011 1726882560.77233: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941/AnsiballZ_network_connections.py && sleep 0' 28011 1726882560.78614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882560.78710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882560.78898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882560.78902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882561.06935: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28011 1726882561.08719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882561.08727: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882561.08781: stderr chunk (state=3): >>><<< 28011 1726882561.08784: stdout chunk (state=3): >>><<< 28011 1726882561.08812: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
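The round trip above is Ansible's standard AnsiballZ lifecycle: sftp the wrapper script to a remote temp dir, chmod it, run it with the remote Python, read a single JSON document back on stdout, then remove the temp dir. A minimal sketch of picking that stdout apart (field names copied from the result in the log; the abbreviated JSON literal here is illustrative, not the full payload):

```python
import json

# The module's entire stdout is one JSON document; the fields below mirror
# the network_connections result shown in the log (changed, warnings,
# stderr, and an echo of the module arguments under "invocation").
raw = '''{"changed": true, "warnings": [], "stderr": "\\n",
          "invocation": {"module_args": {"provider": "nm",
              "connections": [{"name": "ethtest0", "state": "down"}]}}}'''

result = json.loads(raw)
assert result["changed"] is True

# The requested profile state comes back in the invocation echo:
conn = result["invocation"]["module_args"]["connections"][0]
print(conn["name"], conn["state"])  # ethtest0 down
```

This is why `_low_level_execute_command() done: rc=0, stdout= {...}` is immediately followed by a structured `changed: [managed_node1]` result: the executor only has to JSON-decode the captured stdout.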
28011 1726882561.08848: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882561.08858: _low_level_execute_command(): starting 28011 1726882561.08863: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882560.6447797-29378-125055912560941/ > /dev/null 2>&1 && sleep 0' 28011 1726882561.09440: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882561.09448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882561.09459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882561.09472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882561.09483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882561.09491: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882561.09595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 
1726882561.09602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882561.09604: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882561.09606: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28011 1726882561.09608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882561.09611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882561.09614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882561.09616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882561.09618: stderr chunk (state=3): >>>debug2: match found <<< 28011 1726882561.09620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882561.09710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882561.09713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882561.09735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882561.09815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882561.11644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882561.11900: stderr chunk (state=3): >>><<< 28011 1726882561.11903: stdout chunk (state=3): >>><<< 28011 1726882561.11906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882561.11908: handler run complete 28011 1726882561.11910: attempt loop complete, returning result 28011 1726882561.11912: _execute() done 28011 1726882561.11914: dumping result to json 28011 1726882561.11915: done dumping result, returning 28011 1726882561.11917: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-962d-7c65-0000000000c7] 28011 1726882561.11922: sending task result for task 12673a56-9f93-962d-7c65-0000000000c7 28011 1726882561.11984: done sending task result for task 12673a56-9f93-962d-7c65-0000000000c7 28011 1726882561.11989: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 28011 1726882561.12078: no more pending results, returning what we have 28011 1726882561.12081: results queue empty 28011 
1726882561.12082: checking for any_errors_fatal 28011 1726882561.12089: done checking for any_errors_fatal 28011 1726882561.12090: checking for max_fail_percentage 28011 1726882561.12091: done checking for max_fail_percentage 28011 1726882561.12092: checking to see if all hosts have failed and the running result is not ok 28011 1726882561.12197: done checking to see if all hosts have failed 28011 1726882561.12198: getting the remaining hosts for this loop 28011 1726882561.12200: done getting the remaining hosts for this loop 28011 1726882561.12204: getting the next task for host managed_node1 28011 1726882561.12208: done getting next task for host managed_node1 28011 1726882561.12211: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28011 1726882561.12213: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882561.12223: getting variables 28011 1726882561.12224: in VariableManager get_vars() 28011 1726882561.12258: Calling all_inventory to load vars for managed_node1 28011 1726882561.12261: Calling groups_inventory to load vars for managed_node1 28011 1726882561.12263: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882561.12272: Calling all_plugins_play to load vars for managed_node1 28011 1726882561.12275: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882561.12277: Calling groups_plugins_play to load vars for managed_node1 28011 1726882561.13807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882561.15502: done with get_vars() 28011 1726882561.15528: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 21:36:01 -0400 (0:00:00.746) 0:00:30.709 ******

28011 1726882561.15816: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 28011 1726882561.16370: worker is 1 (out of 1 available) 28011 1726882561.16382: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 28011 1726882561.16550: done queuing things up, now waiting for results queue to drain 28011 1726882561.16553: waiting for pending results... 
28011 1726882561.17038: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 28011 1726882561.17151: in run() - task 12673a56-9f93-962d-7c65-0000000000c8 28011 1726882561.17164: variable 'ansible_search_path' from source: unknown 28011 1726882561.17168: variable 'ansible_search_path' from source: unknown 28011 1726882561.17400: calling self._execute() 28011 1726882561.17404: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882561.17407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882561.17410: variable 'omit' from source: magic vars 28011 1726882561.17725: variable 'ansible_distribution_major_version' from source: facts 28011 1726882561.17737: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882561.17961: variable 'network_state' from source: role '' defaults 28011 1726882561.17964: Evaluated conditional (network_state != {}): False 28011 1726882561.17966: when evaluation is False, skipping this task 28011 1726882561.17968: _execute() done 28011 1726882561.17970: dumping result to json 28011 1726882561.17971: done dumping result, returning 28011 1726882561.17973: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-962d-7c65-0000000000c8] 28011 1726882561.17975: sending task result for task 12673a56-9f93-962d-7c65-0000000000c8 28011 1726882561.18037: done sending task result for task 12673a56-9f93-962d-7c65-0000000000c8 28011 1726882561.18040: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882561.18114: no more pending results, returning what we have 28011 1726882561.18118: results queue empty 28011 1726882561.18119: checking for any_errors_fatal 28011 1726882561.18131: done checking for any_errors_fatal 
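The skip above follows directly from the variable sources logged just before it: `network_state` comes "from source: role '' defaults", i.e. an empty dict, so the task's `when: network_state != {}` guard evaluates False. The same check in plain Python (variable name taken from the log; the empty-dict default is what the log implies, not something shown literally):

```python
# network_state defaults to {} in the role, per
# "variable 'network_state' from source: role '' defaults"
network_state = {}

# Equivalent of the task's `when: network_state != {}` conditional:
run_task = network_state != {}
print(run_task)  # False -> "skipping: [managed_node1]"
```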
28011 1726882561.18131: checking for max_fail_percentage 28011 1726882561.18133: done checking for max_fail_percentage 28011 1726882561.18134: checking to see if all hosts have failed and the running result is not ok 28011 1726882561.18135: done checking to see if all hosts have failed 28011 1726882561.18135: getting the remaining hosts for this loop 28011 1726882561.18137: done getting the remaining hosts for this loop 28011 1726882561.18140: getting the next task for host managed_node1 28011 1726882561.18144: done getting next task for host managed_node1 28011 1726882561.18148: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28011 1726882561.18150: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882561.18163: getting variables 28011 1726882561.18164: in VariableManager get_vars() 28011 1726882561.18198: Calling all_inventory to load vars for managed_node1 28011 1726882561.18200: Calling groups_inventory to load vars for managed_node1 28011 1726882561.18202: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882561.18212: Calling all_plugins_play to load vars for managed_node1 28011 1726882561.18214: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882561.18217: Calling groups_plugins_play to load vars for managed_node1 28011 1726882561.20254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882561.21868: done with get_vars() 28011 1726882561.21898: done getting variables 28011 1726882561.21958: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 21:36:01 -0400 (0:00:00.061) 0:00:30.771 ******

28011 1726882561.21987: entering _queue_task() for managed_node1/debug 28011 1726882561.22345: worker is 1 (out of 1 available) 28011 1726882561.22358: exiting _queue_task() for managed_node1/debug 28011 1726882561.22371: done queuing things up, now waiting for results queue to drain 28011 1726882561.22372: waiting for pending results... 
28011 1726882561.22645: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28011 1726882561.22800: in run() - task 12673a56-9f93-962d-7c65-0000000000c9 28011 1726882561.22803: variable 'ansible_search_path' from source: unknown 28011 1726882561.22806: variable 'ansible_search_path' from source: unknown 28011 1726882561.22822: calling self._execute() 28011 1726882561.23201: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882561.23205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882561.23208: variable 'omit' from source: magic vars 28011 1726882561.23692: variable 'ansible_distribution_major_version' from source: facts 28011 1726882561.23898: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882561.23902: variable 'omit' from source: magic vars 28011 1726882561.23904: variable 'omit' from source: magic vars 28011 1726882561.24016: variable 'omit' from source: magic vars 28011 1726882561.24064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882561.24103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882561.24163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882561.24298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882561.24302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882561.24325: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882561.24335: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882561.24364: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 28011 1726882561.24579: Set connection var ansible_connection to ssh 28011 1726882561.24592: Set connection var ansible_pipelining to False 28011 1726882561.24604: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882561.24695: Set connection var ansible_shell_executable to /bin/sh 28011 1726882561.24795: Set connection var ansible_timeout to 10 28011 1726882561.24799: Set connection var ansible_shell_type to sh 28011 1726882561.24801: variable 'ansible_shell_executable' from source: unknown 28011 1726882561.24803: variable 'ansible_connection' from source: unknown 28011 1726882561.24806: variable 'ansible_module_compression' from source: unknown 28011 1726882561.24808: variable 'ansible_shell_type' from source: unknown 28011 1726882561.24809: variable 'ansible_shell_executable' from source: unknown 28011 1726882561.24812: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882561.24814: variable 'ansible_pipelining' from source: unknown 28011 1726882561.24816: variable 'ansible_timeout' from source: unknown 28011 1726882561.24818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882561.24994: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882561.25199: variable 'omit' from source: magic vars 28011 1726882561.25202: starting attempt loop 28011 1726882561.25204: running the handler 28011 1726882561.25414: variable '__network_connections_result' from source: set_fact 28011 1726882561.25542: handler run complete 28011 1726882561.25770: attempt loop complete, returning result 28011 1726882561.25774: _execute() done 28011 1726882561.25776: dumping result to json 28011 1726882561.25778: 
done dumping result, returning 28011 1726882561.25781: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-962d-7c65-0000000000c9] 28011 1726882561.25783: sending task result for task 12673a56-9f93-962d-7c65-0000000000c9 28011 1726882561.25852: done sending task result for task 12673a56-9f93-962d-7c65-0000000000c9 28011 1726882561.25856: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 28011 1726882561.25937: no more pending results, returning what we have 28011 1726882561.25941: results queue empty 28011 1726882561.25942: checking for any_errors_fatal 28011 1726882561.25951: done checking for any_errors_fatal 28011 1726882561.25952: checking for max_fail_percentage 28011 1726882561.25954: done checking for max_fail_percentage 28011 1726882561.25955: checking to see if all hosts have failed and the running result is not ok 28011 1726882561.25956: done checking to see if all hosts have failed 28011 1726882561.25957: getting the remaining hosts for this loop 28011 1726882561.25958: done getting the remaining hosts for this loop 28011 1726882561.25963: getting the next task for host managed_node1 28011 1726882561.25969: done getting next task for host managed_node1 28011 1726882561.25974: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28011 1726882561.25976: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882561.25987: getting variables 28011 1726882561.25989: in VariableManager get_vars() 28011 1726882561.26029: Calling all_inventory to load vars for managed_node1 28011 1726882561.26031: Calling groups_inventory to load vars for managed_node1 28011 1726882561.26033: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882561.26044: Calling all_plugins_play to load vars for managed_node1 28011 1726882561.26046: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882561.26048: Calling groups_plugins_play to load vars for managed_node1 28011 1726882561.29410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882561.32815: done with get_vars() 28011 1726882561.32848: done getting variables 28011 1726882561.33119: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 21:36:01 -0400 (0:00:00.111) 0:00:30.882 ******

28011 1726882561.33150: entering _queue_task() for managed_node1/debug 28011 1726882561.33835: worker is 1 (out of 1 available) 28011 1726882561.33847: exiting _queue_task() for managed_node1/debug 28011 1726882561.33861: done queuing things up, now waiting for results queue to drain 28011 1726882561.33863: waiting for pending results... 
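The earlier "Show stderr messages" debug task printed `__network_connections_result.stderr_lines` as `[""]` even though the module's raw stderr was `"\n"`. That is the expected derivation: Ansible builds the `*_lines` companions of `stdout`/`stderr` by splitting on line boundaries, and a lone newline splits to a single empty string. A one-liner demonstrating the same behavior:

```python
# Module stderr as reported in the log:
stderr = "\n"

# splitlines() on a string containing only a newline yields one empty line,
# matching the stderr_lines == [""] shown in the debug task output.
stderr_lines = stderr.splitlines()
print(stderr_lines)  # ['']
```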
28011 1726882561.34444: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28011 1726882561.34892: in run() - task 12673a56-9f93-962d-7c65-0000000000ca 28011 1726882561.34898: variable 'ansible_search_path' from source: unknown 28011 1726882561.34901: variable 'ansible_search_path' from source: unknown 28011 1726882561.34951: calling self._execute() 28011 1726882561.35501: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882561.35506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882561.35509: variable 'omit' from source: magic vars 28011 1726882561.36310: variable 'ansible_distribution_major_version' from source: facts 28011 1726882561.36323: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882561.36330: variable 'omit' from source: magic vars 28011 1726882561.36381: variable 'omit' from source: magic vars 28011 1726882561.36421: variable 'omit' from source: magic vars 28011 1726882561.36670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882561.36679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882561.36707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882561.36724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882561.36738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882561.36884: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882561.36888: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882561.36905: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 28011 1726882561.37120: Set connection var ansible_connection to ssh 28011 1726882561.37124: Set connection var ansible_pipelining to False 28011 1726882561.37126: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882561.37128: Set connection var ansible_shell_executable to /bin/sh 28011 1726882561.37207: Set connection var ansible_timeout to 10 28011 1726882561.37210: Set connection var ansible_shell_type to sh 28011 1726882561.37212: variable 'ansible_shell_executable' from source: unknown 28011 1726882561.37215: variable 'ansible_connection' from source: unknown 28011 1726882561.37217: variable 'ansible_module_compression' from source: unknown 28011 1726882561.37219: variable 'ansible_shell_type' from source: unknown 28011 1726882561.37221: variable 'ansible_shell_executable' from source: unknown 28011 1726882561.37223: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882561.37225: variable 'ansible_pipelining' from source: unknown 28011 1726882561.37227: variable 'ansible_timeout' from source: unknown 28011 1726882561.37229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882561.37562: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882561.37574: variable 'omit' from source: magic vars 28011 1726882561.37579: starting attempt loop 28011 1726882561.37583: running the handler 28011 1726882561.37861: variable '__network_connections_result' from source: set_fact 28011 1726882561.37952: variable '__network_connections_result' from source: set_fact 28011 1726882561.38096: handler run complete 28011 1726882561.38240: attempt loop complete, returning result 28011 1726882561.38304: 
_execute() done 28011 1726882561.38313: dumping result to json 28011 1726882561.38328: done dumping result, returning 28011 1726882561.38341: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-962d-7c65-0000000000ca] 28011 1726882561.38352: sending task result for task 12673a56-9f93-962d-7c65-0000000000ca 28011 1726882561.38669: done sending task result for task 12673a56-9f93-962d-7c65-0000000000ca 28011 1726882561.38673: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 28011 1726882561.38759: no more pending results, returning what we have 28011 1726882561.38762: results queue empty 28011 1726882561.38763: checking for any_errors_fatal 28011 1726882561.38770: done checking for any_errors_fatal 28011 1726882561.38770: checking for max_fail_percentage 28011 1726882561.38772: done checking for max_fail_percentage 28011 1726882561.38772: checking to see if all hosts have failed and the running result is not ok 28011 1726882561.38773: done checking to see if all hosts have failed 28011 1726882561.38774: getting the remaining hosts for this loop 28011 1726882561.38775: done getting the remaining hosts for this loop 28011 1726882561.38778: getting the next task for host managed_node1 28011 1726882561.38783: done getting next task for host managed_node1 28011 1726882561.38787: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28011 1726882561.38789: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882561.38801: getting variables 28011 1726882561.38802: in VariableManager get_vars() 28011 1726882561.38837: Calling all_inventory to load vars for managed_node1 28011 1726882561.38839: Calling groups_inventory to load vars for managed_node1 28011 1726882561.38841: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882561.38851: Calling all_plugins_play to load vars for managed_node1 28011 1726882561.38854: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882561.38856: Calling groups_plugins_play to load vars for managed_node1 28011 1726882561.41729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882561.45280: done with get_vars() 28011 1726882561.45623: done getting variables 28011 1726882561.45683: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:36:01 -0400 (0:00:00.125) 0:00:31.008 ****** 28011 1726882561.45821: entering _queue_task() for managed_node1/debug 28011 1726882561.46247: worker is 1 (out of 1 available) 28011 1726882561.46262: exiting _queue_task() for managed_node1/debug 28011 1726882561.46389: done queuing things up, now waiting for results queue to drain 28011 1726882561.46391: waiting for pending results... 
28011 1726882561.46616: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28011 1726882561.46700: in run() - task 12673a56-9f93-962d-7c65-0000000000cb 28011 1726882561.46723: variable 'ansible_search_path' from source: unknown 28011 1726882561.46730: variable 'ansible_search_path' from source: unknown 28011 1726882561.46769: calling self._execute() 28011 1726882561.46879: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882561.46896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882561.46912: variable 'omit' from source: magic vars 28011 1726882561.47313: variable 'ansible_distribution_major_version' from source: facts 28011 1726882561.47331: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882561.47469: variable 'network_state' from source: role '' defaults 28011 1726882561.47492: Evaluated conditional (network_state != {}): False 28011 1726882561.47582: when evaluation is False, skipping this task 28011 1726882561.47585: _execute() done 28011 1726882561.47591: dumping result to json 28011 1726882561.47596: done dumping result, returning 28011 1726882561.47600: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-962d-7c65-0000000000cb] 28011 1726882561.47602: sending task result for task 12673a56-9f93-962d-7c65-0000000000cb 28011 1726882561.47797: done sending task result for task 12673a56-9f93-962d-7c65-0000000000cb skipping: [managed_node1] => { "false_condition": "network_state != {}" } 28011 1726882561.47846: no more pending results, returning what we have 28011 1726882561.47850: results queue empty 28011 1726882561.47850: checking for any_errors_fatal 28011 1726882561.47860: done checking for any_errors_fatal 28011 1726882561.47861: checking for max_fail_percentage 28011 1726882561.47863: done 
checking for max_fail_percentage 28011 1726882561.47864: checking to see if all hosts have failed and the running result is not ok 28011 1726882561.47865: done checking to see if all hosts have failed 28011 1726882561.47866: getting the remaining hosts for this loop 28011 1726882561.47868: done getting the remaining hosts for this loop 28011 1726882561.47872: getting the next task for host managed_node1 28011 1726882561.47878: done getting next task for host managed_node1 28011 1726882561.47882: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28011 1726882561.47885: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882561.47907: getting variables 28011 1726882561.47909: in VariableManager get_vars() 28011 1726882561.47950: Calling all_inventory to load vars for managed_node1 28011 1726882561.47952: Calling groups_inventory to load vars for managed_node1 28011 1726882561.47955: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882561.47966: Calling all_plugins_play to load vars for managed_node1 28011 1726882561.47970: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882561.47972: Calling groups_plugins_play to load vars for managed_node1 28011 1726882561.48542: WORKER PROCESS EXITING 28011 1726882561.50435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882561.52451: done with get_vars() 28011 1726882561.52602: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:36:01 -0400 
(0:00:00.070) 0:00:31.079 ****** 28011 1726882561.52821: entering _queue_task() for managed_node1/ping 28011 1726882561.53680: worker is 1 (out of 1 available) 28011 1726882561.53698: exiting _queue_task() for managed_node1/ping 28011 1726882561.53709: done queuing things up, now waiting for results queue to drain 28011 1726882561.53711: waiting for pending results... 28011 1726882561.54077: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 28011 1726882561.54283: in run() - task 12673a56-9f93-962d-7c65-0000000000cc 28011 1726882561.54456: variable 'ansible_search_path' from source: unknown 28011 1726882561.54460: variable 'ansible_search_path' from source: unknown 28011 1726882561.54502: calling self._execute() 28011 1726882561.54752: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882561.54756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882561.54759: variable 'omit' from source: magic vars 28011 1726882561.55248: variable 'ansible_distribution_major_version' from source: facts 28011 1726882561.55265: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882561.55275: variable 'omit' from source: magic vars 28011 1726882561.55433: variable 'omit' from source: magic vars 28011 1726882561.55436: variable 'omit' from source: magic vars 28011 1726882561.55438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882561.55443: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882561.55468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882561.55489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882561.55506: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882561.55550: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882561.55559: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882561.55566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882561.55673: Set connection var ansible_connection to ssh 28011 1726882561.55686: Set connection var ansible_pipelining to False 28011 1726882561.55699: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882561.55709: Set connection var ansible_shell_executable to /bin/sh 28011 1726882561.55720: Set connection var ansible_timeout to 10 28011 1726882561.55730: Set connection var ansible_shell_type to sh 28011 1726882561.55764: variable 'ansible_shell_executable' from source: unknown 28011 1726882561.55772: variable 'ansible_connection' from source: unknown 28011 1726882561.55779: variable 'ansible_module_compression' from source: unknown 28011 1726882561.55785: variable 'ansible_shell_type' from source: unknown 28011 1726882561.55791: variable 'ansible_shell_executable' from source: unknown 28011 1726882561.55799: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882561.55807: variable 'ansible_pipelining' from source: unknown 28011 1726882561.55812: variable 'ansible_timeout' from source: unknown 28011 1726882561.55819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882561.56043: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882561.56061: variable 'omit' from source: magic vars 28011 1726882561.56083: starting attempt loop 28011 1726882561.56087: running 
the handler 28011 1726882561.56195: _low_level_execute_command(): starting 28011 1726882561.56198: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882561.56997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882561.57070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882561.57123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882561.57270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882561.57285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882561.57479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882561.59163: stdout chunk (state=3): >>>/root <<< 28011 1726882561.59306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882561.59325: stdout chunk (state=3): >>><<< 28011 1726882561.59343: stderr chunk (state=3): >>><<< 28011 1726882561.59373: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882561.59396: _low_level_execute_command(): starting 28011 1726882561.59408: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426 `" && echo ansible-tmp-1726882561.5937998-29421-90507194881426="` echo /root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426 `" ) && sleep 0' 28011 1726882561.60036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882561.60050: stderr chunk (state=3): 
>>>debug2: match not found <<< 28011 1726882561.60120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882561.60184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882561.60212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882561.60245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882561.60328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882561.62228: stdout chunk (state=3): >>>ansible-tmp-1726882561.5937998-29421-90507194881426=/root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426 <<< 28011 1726882561.62405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882561.62409: stdout chunk (state=3): >>><<< 28011 1726882561.62411: stderr chunk (state=3): >>><<< 28011 1726882561.62414: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882561.5937998-29421-90507194881426=/root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882561.62424: variable 'ansible_module_compression' from source: unknown 28011 1726882561.62463: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28011 1726882561.62497: variable 'ansible_facts' from source: unknown 28011 1726882561.62566: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426/AnsiballZ_ping.py 28011 1726882561.62765: Sending initial data 28011 1726882561.62769: Sent initial data (152 bytes) 28011 1726882561.63243: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882561.63252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882561.63295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882561.63397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882561.63402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882561.63470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882561.65029: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882561.65104: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882561.65197: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp_8xixvwt /root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426/AnsiballZ_ping.py <<< 28011 1726882561.65200: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426/AnsiballZ_ping.py" <<< 28011 1726882561.65300: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp_8xixvwt" to remote "/root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426/AnsiballZ_ping.py" <<< 28011 1726882561.66348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882561.66359: stdout chunk (state=3): >>><<< 28011 1726882561.66377: stderr chunk (state=3): >>><<< 28011 1726882561.66452: done transferring module to remote 28011 1726882561.66469: _low_level_execute_command(): starting 28011 1726882561.66557: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426/ /root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426/AnsiballZ_ping.py && sleep 0' 28011 1726882561.67145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882561.67208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882561.67279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882561.67311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882561.67390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882561.69167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882561.69191: stdout chunk (state=3): >>><<< 28011 1726882561.69197: stderr chunk (state=3): >>><<< 28011 1726882561.69298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882561.69302: _low_level_execute_command(): starting 28011 1726882561.69304: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426/AnsiballZ_ping.py && sleep 0' 28011 1726882561.69846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882561.69862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882561.69875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882561.69965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882561.70001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882561.70017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882561.70039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882561.70126: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882561.84947: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28011 1726882561.86262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882561.86279: stderr chunk (state=3): >>><<< 28011 1726882561.86283: stdout chunk (state=3): >>><<< 28011 1726882561.86301: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882561.86322: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882561.86330: _low_level_execute_command(): starting 28011 1726882561.86334: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882561.5937998-29421-90507194881426/ > /dev/null 2>&1 && sleep 0' 28011 1726882561.86759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882561.86762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882561.86765: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882561.86767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
<<< 28011 1726882561.86769: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882561.86825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882561.86828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882561.86867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882561.88898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882561.88902: stdout chunk (state=3): >>><<< 28011 1726882561.88903: stderr chunk (state=3): >>><<< 28011 1726882561.88905: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
28011 1726882561.88911: handler run complete 28011 1726882561.88913: attempt loop complete, returning result 28011 1726882561.88915: _execute() done 28011 1726882561.88916: dumping result to json 28011 1726882561.88917: done dumping result, returning 28011 1726882561.88919: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-962d-7c65-0000000000cc] 28011 1726882561.88921: sending task result for task 12673a56-9f93-962d-7c65-0000000000cc 28011 1726882561.88978: done sending task result for task 12673a56-9f93-962d-7c65-0000000000cc 28011 1726882561.88980: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 28011 1726882561.89038: no more pending results, returning what we have 28011 1726882561.89042: results queue empty 28011 1726882561.89042: checking for any_errors_fatal 28011 1726882561.89047: done checking for any_errors_fatal 28011 1726882561.89048: checking for max_fail_percentage 28011 1726882561.89049: done checking for max_fail_percentage 28011 1726882561.89050: checking to see if all hosts have failed and the running result is not ok 28011 1726882561.89051: done checking to see if all hosts have failed 28011 1726882561.89051: getting the remaining hosts for this loop 28011 1726882561.89053: done getting the remaining hosts for this loop 28011 1726882561.89056: getting the next task for host managed_node1 28011 1726882561.89062: done getting next task for host managed_node1 28011 1726882561.89064: ^ task is: TASK: meta (role_complete) 28011 1726882561.89065: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882561.89073: getting variables 28011 1726882561.89075: in VariableManager get_vars() 28011 1726882561.89109: Calling all_inventory to load vars for managed_node1 28011 1726882561.89112: Calling groups_inventory to load vars for managed_node1 28011 1726882561.89114: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882561.89122: Calling all_plugins_play to load vars for managed_node1 28011 1726882561.89125: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882561.89127: Calling groups_plugins_play to load vars for managed_node1 28011 1726882561.94806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882561.96361: done with get_vars() 28011 1726882561.96387: done getting variables 28011 1726882561.96455: done queuing things up, now waiting for results queue to drain 28011 1726882561.96457: results queue empty 28011 1726882561.96458: checking for any_errors_fatal 28011 1726882561.96461: done checking for any_errors_fatal 28011 1726882561.96461: checking for max_fail_percentage 28011 1726882561.96463: done checking for max_fail_percentage 28011 1726882561.96463: checking to see if all hosts have failed and the running result is not ok 28011 1726882561.96464: done checking to see if all hosts have failed 28011 1726882561.96465: getting the remaining hosts for this loop 28011 1726882561.96466: done getting the remaining hosts for this loop 28011 1726882561.96468: getting the next task for host managed_node1 28011 1726882561.96471: done getting next task for host managed_node1 28011 1726882561.96473: ^ task is: TASK: meta (flush_handlers) 28011 1726882561.96474: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28011 1726882561.96477: getting variables 28011 1726882561.96478: in VariableManager get_vars() 28011 1726882561.96489: Calling all_inventory to load vars for managed_node1 28011 1726882561.96492: Calling groups_inventory to load vars for managed_node1 28011 1726882561.96496: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882561.96501: Calling all_plugins_play to load vars for managed_node1 28011 1726882561.96503: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882561.96506: Calling groups_plugins_play to load vars for managed_node1 28011 1726882561.97624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882561.99124: done with get_vars() 28011 1726882561.99144: done getting variables 28011 1726882561.99258: in VariableManager get_vars() 28011 1726882561.99272: Calling all_inventory to load vars for managed_node1 28011 1726882561.99274: Calling groups_inventory to load vars for managed_node1 28011 1726882561.99276: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882561.99281: Calling all_plugins_play to load vars for managed_node1 28011 1726882561.99291: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882561.99302: Calling groups_plugins_play to load vars for managed_node1 28011 1726882562.01179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882562.02869: done with get_vars() 28011 1726882562.02900: done queuing things up, now waiting for results queue to drain 28011 1726882562.02902: results queue empty 28011 1726882562.02903: checking for any_errors_fatal 28011 1726882562.02904: done checking for any_errors_fatal 28011 1726882562.02905: checking for max_fail_percentage 28011 1726882562.02906: done checking for max_fail_percentage 28011 1726882562.02907: checking to see if all hosts have failed and 
the running result is not ok 28011 1726882562.02907: done checking to see if all hosts have failed 28011 1726882562.02908: getting the remaining hosts for this loop 28011 1726882562.02909: done getting the remaining hosts for this loop 28011 1726882562.02912: getting the next task for host managed_node1 28011 1726882562.02916: done getting next task for host managed_node1 28011 1726882562.02917: ^ task is: TASK: meta (flush_handlers) 28011 1726882562.02918: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882562.02921: getting variables 28011 1726882562.02922: in VariableManager get_vars() 28011 1726882562.02932: Calling all_inventory to load vars for managed_node1 28011 1726882562.02934: Calling groups_inventory to load vars for managed_node1 28011 1726882562.02936: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882562.02946: Calling all_plugins_play to load vars for managed_node1 28011 1726882562.02949: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882562.02952: Calling groups_plugins_play to load vars for managed_node1 28011 1726882562.04173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882562.05913: done with get_vars() 28011 1726882562.05937: done getting variables 28011 1726882562.05989: in VariableManager get_vars() 28011 1726882562.06006: Calling all_inventory to load vars for managed_node1 28011 1726882562.06009: Calling groups_inventory to load vars for managed_node1 28011 1726882562.06011: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882562.06016: Calling all_plugins_play to load vars for managed_node1 28011 1726882562.06019: Calling 
groups_plugins_inventory to load vars for managed_node1 28011 1726882562.06021: Calling groups_plugins_play to load vars for managed_node1 28011 1726882562.07291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882562.08989: done with get_vars() 28011 1726882562.09027: done queuing things up, now waiting for results queue to drain 28011 1726882562.09030: results queue empty 28011 1726882562.09031: checking for any_errors_fatal 28011 1726882562.09032: done checking for any_errors_fatal 28011 1726882562.09033: checking for max_fail_percentage 28011 1726882562.09034: done checking for max_fail_percentage 28011 1726882562.09034: checking to see if all hosts have failed and the running result is not ok 28011 1726882562.09035: done checking to see if all hosts have failed 28011 1726882562.09036: getting the remaining hosts for this loop 28011 1726882562.09037: done getting the remaining hosts for this loop 28011 1726882562.09040: getting the next task for host managed_node1 28011 1726882562.09043: done getting next task for host managed_node1 28011 1726882562.09044: ^ task is: None 28011 1726882562.09045: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882562.09046: done queuing things up, now waiting for results queue to drain 28011 1726882562.09047: results queue empty 28011 1726882562.09048: checking for any_errors_fatal 28011 1726882562.09048: done checking for any_errors_fatal 28011 1726882562.09049: checking for max_fail_percentage 28011 1726882562.09050: done checking for max_fail_percentage 28011 1726882562.09050: checking to see if all hosts have failed and the running result is not ok 28011 1726882562.09051: done checking to see if all hosts have failed 28011 1726882562.09052: getting the next task for host managed_node1 28011 1726882562.09054: done getting next task for host managed_node1 28011 1726882562.09055: ^ task is: None 28011 1726882562.09056: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882562.09090: in VariableManager get_vars() 28011 1726882562.09115: done with get_vars() 28011 1726882562.09121: in VariableManager get_vars() 28011 1726882562.09135: done with get_vars() 28011 1726882562.09140: variable 'omit' from source: magic vars 28011 1726882562.09167: in VariableManager get_vars() 28011 1726882562.09176: done with get_vars() 28011 1726882562.09199: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 28011 1726882562.09374: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28011 1726882562.09401: getting the remaining hosts for this loop 28011 1726882562.09403: done getting the remaining hosts for this loop 28011 1726882562.09405: getting the next task for host managed_node1 28011 1726882562.09408: done getting next task for host managed_node1 28011 1726882562.09410: ^ task is: TASK: Gathering Facts 28011 1726882562.09411: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882562.09413: getting variables 28011 1726882562.09414: in VariableManager get_vars() 28011 1726882562.09422: Calling all_inventory to load vars for managed_node1 28011 1726882562.09424: Calling groups_inventory to load vars for managed_node1 28011 1726882562.09426: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882562.09432: Calling all_plugins_play to load vars for managed_node1 28011 1726882562.09434: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882562.09437: Calling groups_plugins_play to load vars for managed_node1 28011 1726882562.10764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882562.12441: done with get_vars() 28011 1726882562.12466: done getting variables 28011 1726882562.12540: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 21:36:02 -0400 (0:00:00.597) 0:00:31.676 ****** 28011 1726882562.12564: entering _queue_task() for managed_node1/gather_facts 28011 1726882562.12949: worker is 1 (out of 1 available) 28011 1726882562.12960: exiting _queue_task() for managed_node1/gather_facts 28011 1726882562.12972: done queuing things up, now waiting for results queue to drain 28011 1726882562.12973: waiting for pending results... 
28011 1726882562.13211: running TaskExecutor() for managed_node1/TASK: Gathering Facts 28011 1726882562.13272: in run() - task 12673a56-9f93-962d-7c65-00000000076f 28011 1726882562.13296: variable 'ansible_search_path' from source: unknown 28011 1726882562.13337: calling self._execute() 28011 1726882562.13432: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882562.13444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882562.13697: variable 'omit' from source: magic vars 28011 1726882562.13812: variable 'ansible_distribution_major_version' from source: facts 28011 1726882562.13872: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882562.13882: variable 'omit' from source: magic vars 28011 1726882562.13927: variable 'omit' from source: magic vars 28011 1726882562.13971: variable 'omit' from source: magic vars 28011 1726882562.14033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882562.14076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882562.14137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882562.14160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882562.14176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882562.14244: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882562.14255: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882562.14271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882562.14483: Set connection var ansible_connection to ssh 28011 1726882562.14539: Set 
connection var ansible_pipelining to False 28011 1726882562.14579: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882562.14595: Set connection var ansible_shell_executable to /bin/sh 28011 1726882562.14607: Set connection var ansible_timeout to 10 28011 1726882562.14616: Set connection var ansible_shell_type to sh 28011 1726882562.14641: variable 'ansible_shell_executable' from source: unknown 28011 1726882562.14649: variable 'ansible_connection' from source: unknown 28011 1726882562.14655: variable 'ansible_module_compression' from source: unknown 28011 1726882562.14666: variable 'ansible_shell_type' from source: unknown 28011 1726882562.14674: variable 'ansible_shell_executable' from source: unknown 28011 1726882562.14685: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882562.14697: variable 'ansible_pipelining' from source: unknown 28011 1726882562.14706: variable 'ansible_timeout' from source: unknown 28011 1726882562.14714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882562.14938: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882562.14955: variable 'omit' from source: magic vars 28011 1726882562.14965: starting attempt loop 28011 1726882562.14972: running the handler 28011 1726882562.15011: variable 'ansible_facts' from source: unknown 28011 1726882562.15038: _low_level_execute_command(): starting 28011 1726882562.15051: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882562.16401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882562.16406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882562.16408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882562.16410: stderr chunk (state=3): >>>debug2: match found <<< 28011 1726882562.16425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882562.16468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882562.16489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882562.16532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882562.16619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882562.18290: stdout chunk (state=3): >>>/root <<< 28011 1726882562.18481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882562.18485: stdout chunk (state=3): >>><<< 28011 1726882562.18490: stderr chunk (state=3): >>><<< 28011 1726882562.18617: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882562.18621: _low_level_execute_command(): starting 28011 1726882562.18624: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851 `" && echo ansible-tmp-1726882562.1852384-29443-195886687934851="` echo /root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851 `" ) && sleep 0' 28011 1726882562.19782: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882562.19785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882562.19791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass <<< 28011 1726882562.19806: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882562.19809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882562.19955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882562.19959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882562.19989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882562.20058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882562.21923: stdout chunk (state=3): >>>ansible-tmp-1726882562.1852384-29443-195886687934851=/root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851 <<< 28011 1726882562.22104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882562.22135: stderr chunk (state=3): >>><<< 28011 1726882562.22152: stdout chunk (state=3): >>><<< 28011 1726882562.22178: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882562.1852384-29443-195886687934851=/root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882562.22225: variable 'ansible_module_compression' from source: unknown 28011 1726882562.22599: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28011 1726882562.22602: variable 'ansible_facts' from source: unknown 28011 1726882562.22604: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851/AnsiballZ_setup.py 28011 1726882562.22849: Sending initial data 28011 1726882562.22859: Sent initial data (154 bytes) 28011 1726882562.23812: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882562.23836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882562.23852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882562.23915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882562.23992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882562.25556: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882562.25612: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882562.25672: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpiqbvy9el /root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851/AnsiballZ_setup.py <<< 28011 1726882562.25709: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851/AnsiballZ_setup.py" <<< 28011 1726882562.25740: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpiqbvy9el" to remote "/root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851/AnsiballZ_setup.py" <<< 28011 1726882562.27637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882562.27641: stdout chunk (state=3): >>><<< 28011 1726882562.27644: stderr chunk (state=3): >>><<< 28011 1726882562.27734: done transferring module to remote 28011 1726882562.27859: _low_level_execute_command(): starting 28011 1726882562.27863: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851/ /root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851/AnsiballZ_setup.py && sleep 0' 28011 1726882562.28851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882562.28854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882562.28905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882562.29033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882562.29036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882562.29091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882562.29137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882562.31098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882562.31102: stdout chunk (state=3): >>><<< 28011 1726882562.31105: stderr chunk (state=3): >>><<< 28011 1726882562.31107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882562.31110: _low_level_execute_command(): starting 28011 1726882562.31113: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851/AnsiballZ_setup.py && sleep 0' 28011 1726882562.32179: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882562.32197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882562.32215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882562.32234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882562.32261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882562.32309: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882562.32371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' <<< 28011 1726882562.32391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882562.32406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882562.32484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882562.97383: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": 
"/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.44384765625, "5m": 0.3916015625, "15m": <<< 28011 1726882562.97397: stdout chunk (state=3): >>>0.21875}, "ansible_interfaces": ["lo", "ethtest0", "eth0", "peerethtest0"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "4e:7e:e5:ed:a3:c7", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::4c7e:e5ff:feed:a3c7", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "22:94:82:4a:26:63", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::2094:82ff:fe4a:2663", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off"<<< 28011 1726882562.97426: stdout chunk (state=3): >>>, "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": 
{"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off 
[fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_seg<<< 28011 1726882562.97446: stdout chunk (state=3): >>>mentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4c7e:e5ff:feed:a3c7", "fe80::2094:82ff:fe4a:2663", "fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223", "fe80::4c7e:e5ff:feed:a3c7"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) 
Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2945, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 586, "free": 2945}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions":<<< 28011 1726882562.97456: stdout chunk (state=3): >>> {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", 
"host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 995, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789868032, "block_size": 4096, "block_total": 65519099, "block_available": 63913542, "block_used": 1605557, "inode_total": 131070960, "inode_available": 131029044, "inode_used": 41916, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "02", "epoch": "1726882562", "epoch_int": "1726882562", "date": "2024-09-20", "time": "21:36:02", "iso8601_micro": "2024-09-21T01:36:02.970303Z", "iso8601": "2024-09-21T01:36:02Z", "iso8601_basic": "20240920T213602970303", "iso8601_basic_short": "20240920T213602", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": 
{"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28011 1726882562.99497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882562.99526: stderr chunk (state=3): >>><<< 28011 1726882562.99529: stdout chunk (state=3): >>><<< 28011 1726882562.99575: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": 
"/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.44384765625, "5m": 0.3916015625, "15m": 0.21875}, "ansible_interfaces": ["lo", "ethtest0", "eth0", "peerethtest0"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "4e:7e:e5:ed:a3:c7", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::4c7e:e5ff:feed:a3c7", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off 
[fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "22:94:82:4a:26:63", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::2094:82ff:fe4a:2663", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": 
[{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off 
[fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4c7e:e5ff:feed:a3c7", "fe80::2094:82ff:fe4a:2663", "fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223", "fe80::4c7e:e5ff:feed:a3c7"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2945, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 586, "free": 2945}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 995, "ansible_lvm": {"lvs": {}, "vgs": {}, 
"pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789868032, "block_size": 4096, "block_total": 65519099, "block_available": 63913542, "block_used": 1605557, "inode_total": 131070960, "inode_available": 131029044, "inode_used": 41916, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "02", "epoch": "1726882562", "epoch_int": "1726882562", "date": "2024-09-20", "time": "21:36:02", "iso8601_micro": "2024-09-21T01:36:02.970303Z", "iso8601": "2024-09-21T01:36:02Z", "iso8601_basic": "20240920T213602970303", "iso8601_basic_short": "20240920T213602", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882562.99845: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882562.99867: _low_level_execute_command(): starting 28011 1726882562.99870: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882562.1852384-29443-195886687934851/ > /dev/null 2>&1 && sleep 0' 28011 1726882563.00330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882563.00334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882563.00336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882563.00338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882563.00385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882563.00396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882563.00398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882563.00440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882563.02279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882563.02307: stderr chunk (state=3): >>><<< 28011 1726882563.02311: stdout chunk (state=3): >>><<< 28011 1726882563.02324: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 28011 1726882563.02330: handler run complete 28011 1726882563.02422: variable 'ansible_facts' from source: unknown 28011 1726882563.02495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882563.02701: variable 'ansible_facts' from source: unknown 28011 1726882563.02759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882563.02860: attempt loop complete, returning result 28011 1726882563.02864: _execute() done 28011 1726882563.02866: dumping result to json 28011 1726882563.02891: done dumping result, returning 28011 1726882563.02898: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-962d-7c65-00000000076f] 28011 1726882563.02904: sending task result for task 12673a56-9f93-962d-7c65-00000000076f 28011 1726882563.03239: done sending task result for task 12673a56-9f93-962d-7c65-00000000076f 28011 1726882563.03241: WORKER PROCESS EXITING ok: [managed_node1] 28011 1726882563.03515: no more pending results, returning what we have 28011 1726882563.03517: results queue empty 28011 1726882563.03518: checking for any_errors_fatal 28011 1726882563.03519: done checking for any_errors_fatal 28011 1726882563.03519: checking for max_fail_percentage 28011 1726882563.03520: done checking for max_fail_percentage 28011 1726882563.03521: checking to see if all hosts have failed and the running result is not ok 28011 1726882563.03521: done checking to see if all hosts have failed 28011 1726882563.03522: getting the remaining hosts for this loop 28011 1726882563.03523: done getting the remaining hosts for this loop 28011 1726882563.03525: getting the next task for host managed_node1 28011 1726882563.03528: done getting next task for host managed_node1 28011 1726882563.03529: ^ task is: TASK: meta (flush_handlers) 28011 1726882563.03530: ^ state is: HOST STATE: block=1, task=1, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882563.03533: getting variables 28011 1726882563.03534: in VariableManager get_vars() 28011 1726882563.03553: Calling all_inventory to load vars for managed_node1 28011 1726882563.03555: Calling groups_inventory to load vars for managed_node1 28011 1726882563.03557: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882563.03565: Calling all_plugins_play to load vars for managed_node1 28011 1726882563.03567: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882563.03568: Calling groups_plugins_play to load vars for managed_node1 28011 1726882563.04309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882563.05765: done with get_vars() 28011 1726882563.05781: done getting variables 28011 1726882563.05833: in VariableManager get_vars() 28011 1726882563.05841: Calling all_inventory to load vars for managed_node1 28011 1726882563.05842: Calling groups_inventory to load vars for managed_node1 28011 1726882563.05844: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882563.05847: Calling all_plugins_play to load vars for managed_node1 28011 1726882563.05849: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882563.05850: Calling groups_plugins_play to load vars for managed_node1 28011 1726882563.06498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882563.07467: done with get_vars() 28011 1726882563.07501: done queuing things up, now waiting for results queue to drain 28011 1726882563.07503: results queue empty 28011 1726882563.07504: checking for any_errors_fatal 28011 
1726882563.07508: done checking for any_errors_fatal 28011 1726882563.07509: checking for max_fail_percentage 28011 1726882563.07510: done checking for max_fail_percentage 28011 1726882563.07515: checking to see if all hosts have failed and the running result is not ok 28011 1726882563.07516: done checking to see if all hosts have failed 28011 1726882563.07517: getting the remaining hosts for this loop 28011 1726882563.07518: done getting the remaining hosts for this loop 28011 1726882563.07521: getting the next task for host managed_node1 28011 1726882563.07524: done getting next task for host managed_node1 28011 1726882563.07527: ^ task is: TASK: Include the task 'delete_interface.yml' 28011 1726882563.07528: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882563.07530: getting variables 28011 1726882563.07531: in VariableManager get_vars() 28011 1726882563.07539: Calling all_inventory to load vars for managed_node1 28011 1726882563.07540: Calling groups_inventory to load vars for managed_node1 28011 1726882563.07542: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882563.07547: Calling all_plugins_play to load vars for managed_node1 28011 1726882563.07549: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882563.07551: Calling groups_plugins_play to load vars for managed_node1 28011 1726882563.08831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882563.10440: done with get_vars() 28011 1726882563.10460: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 21:36:03 -0400 (0:00:00.979) 0:00:32.656 ****** 28011 1726882563.10541: entering _queue_task() for managed_node1/include_tasks 28011 1726882563.10901: worker is 1 (out of 1 available) 28011 1726882563.10913: exiting _queue_task() for managed_node1/include_tasks 28011 1726882563.10924: done queuing things up, now waiting for results queue to drain 28011 1726882563.10925: waiting for pending results... 
28011 1726882563.11316: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 28011 1726882563.11322: in run() - task 12673a56-9f93-962d-7c65-0000000000cf 28011 1726882563.11411: variable 'ansible_search_path' from source: unknown 28011 1726882563.11414: calling self._execute() 28011 1726882563.11476: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882563.11488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882563.11505: variable 'omit' from source: magic vars 28011 1726882563.11914: variable 'ansible_distribution_major_version' from source: facts 28011 1726882563.11933: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882563.11945: _execute() done 28011 1726882563.11959: dumping result to json 28011 1726882563.11992: done dumping result, returning 28011 1726882563.12064: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [12673a56-9f93-962d-7c65-0000000000cf] 28011 1726882563.12067: sending task result for task 12673a56-9f93-962d-7c65-0000000000cf 28011 1726882563.12147: done sending task result for task 12673a56-9f93-962d-7c65-0000000000cf 28011 1726882563.12150: WORKER PROCESS EXITING 28011 1726882563.12201: no more pending results, returning what we have 28011 1726882563.12207: in VariableManager get_vars() 28011 1726882563.12247: Calling all_inventory to load vars for managed_node1 28011 1726882563.12250: Calling groups_inventory to load vars for managed_node1 28011 1726882563.12254: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882563.12269: Calling all_plugins_play to load vars for managed_node1 28011 1726882563.12273: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882563.12276: Calling groups_plugins_play to load vars for managed_node1 28011 1726882563.14212: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882563.15844: done with get_vars() 28011 1726882563.15867: variable 'ansible_search_path' from source: unknown 28011 1726882563.15883: we have included files to process 28011 1726882563.15884: generating all_blocks data 28011 1726882563.15888: done generating all_blocks data 28011 1726882563.15889: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28011 1726882563.15890: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28011 1726882563.15895: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28011 1726882563.16127: done processing included file 28011 1726882563.16129: iterating over new_blocks loaded from include file 28011 1726882563.16130: in VariableManager get_vars() 28011 1726882563.16143: done with get_vars() 28011 1726882563.16144: filtering new block on tags 28011 1726882563.16159: done filtering new block on tags 28011 1726882563.16161: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 28011 1726882563.16166: extending task lists for all hosts with included blocks 28011 1726882563.16199: done extending task lists 28011 1726882563.16201: done processing included files 28011 1726882563.16202: results queue empty 28011 1726882563.16202: checking for any_errors_fatal 28011 1726882563.16204: done checking for any_errors_fatal 28011 1726882563.16205: checking for max_fail_percentage 28011 1726882563.16206: done checking for max_fail_percentage 28011 1726882563.16206: checking to see if all hosts have failed and the running result 
is not ok 28011 1726882563.16207: done checking to see if all hosts have failed 28011 1726882563.16208: getting the remaining hosts for this loop 28011 1726882563.16209: done getting the remaining hosts for this loop 28011 1726882563.16211: getting the next task for host managed_node1 28011 1726882563.16215: done getting next task for host managed_node1 28011 1726882563.16217: ^ task is: TASK: Remove test interface if necessary 28011 1726882563.16219: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882563.16221: getting variables 28011 1726882563.16222: in VariableManager get_vars() 28011 1726882563.16230: Calling all_inventory to load vars for managed_node1 28011 1726882563.16233: Calling groups_inventory to load vars for managed_node1 28011 1726882563.16235: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882563.16240: Calling all_plugins_play to load vars for managed_node1 28011 1726882563.16243: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882563.16245: Calling groups_plugins_play to load vars for managed_node1 28011 1726882563.17479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882563.19326: done with get_vars() 28011 1726882563.19355: done getting variables 28011 1726882563.19412: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:36:03 -0400 (0:00:00.089) 0:00:32.745 ****** 28011 1726882563.19445: entering _queue_task() for managed_node1/command 28011 1726882563.19924: worker is 1 (out of 1 available) 28011 1726882563.19935: exiting _queue_task() for managed_node1/command 28011 1726882563.19946: done queuing things up, now waiting for results queue to drain 28011 1726882563.19948: waiting for pending results... 
28011 1726882563.20322: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 28011 1726882563.20399: in run() - task 12673a56-9f93-962d-7c65-000000000780 28011 1726882563.20403: variable 'ansible_search_path' from source: unknown 28011 1726882563.20405: variable 'ansible_search_path' from source: unknown 28011 1726882563.20415: calling self._execute() 28011 1726882563.20521: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882563.20537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882563.20551: variable 'omit' from source: magic vars 28011 1726882563.20946: variable 'ansible_distribution_major_version' from source: facts 28011 1726882563.20968: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882563.21198: variable 'omit' from source: magic vars 28011 1726882563.21201: variable 'omit' from source: magic vars 28011 1726882563.21203: variable 'interface' from source: set_fact 28011 1726882563.21205: variable 'omit' from source: magic vars 28011 1726882563.21208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882563.21244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882563.21269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882563.21297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882563.21313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882563.21350: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882563.21359: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882563.21366: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882563.21473: Set connection var ansible_connection to ssh 28011 1726882563.21486: Set connection var ansible_pipelining to False 28011 1726882563.21502: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882563.21512: Set connection var ansible_shell_executable to /bin/sh 28011 1726882563.21523: Set connection var ansible_timeout to 10 28011 1726882563.21534: Set connection var ansible_shell_type to sh 28011 1726882563.21564: variable 'ansible_shell_executable' from source: unknown 28011 1726882563.21571: variable 'ansible_connection' from source: unknown 28011 1726882563.21578: variable 'ansible_module_compression' from source: unknown 28011 1726882563.21584: variable 'ansible_shell_type' from source: unknown 28011 1726882563.21595: variable 'ansible_shell_executable' from source: unknown 28011 1726882563.21602: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882563.21610: variable 'ansible_pipelining' from source: unknown 28011 1726882563.21617: variable 'ansible_timeout' from source: unknown 28011 1726882563.21625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882563.21776: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882563.21796: variable 'omit' from source: magic vars 28011 1726882563.21806: starting attempt loop 28011 1726882563.21812: running the handler 28011 1726882563.21835: _low_level_execute_command(): starting 28011 1726882563.21848: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882563.22584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 
1726882563.22613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882563.22700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882563.22735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882563.22752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882563.22772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882563.22860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882563.24560: stdout chunk (state=3): >>>/root <<< 28011 1726882563.24724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882563.24728: stdout chunk (state=3): >>><<< 28011 1726882563.24730: stderr chunk (state=3): >>><<< 28011 1726882563.24849: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882563.24853: _low_level_execute_command(): starting 28011 1726882563.24855: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195 `" && echo ansible-tmp-1726882563.2475467-29490-183452801930195="` echo /root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195 `" ) && sleep 0' 28011 1726882563.25390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882563.25407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882563.25428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882563.25446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882563.25463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882563.25483: stderr chunk (state=3): 
>>>debug2: match not found <<< 28011 1726882563.25547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882563.25587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882563.25606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882563.25630: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882563.25711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882563.27590: stdout chunk (state=3): >>>ansible-tmp-1726882563.2475467-29490-183452801930195=/root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195 <<< 28011 1726882563.27733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882563.27750: stderr chunk (state=3): >>><<< 28011 1726882563.27760: stdout chunk (state=3): >>><<< 28011 1726882563.27786: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882563.2475467-29490-183452801930195=/root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882563.27999: variable 'ansible_module_compression' from source: unknown 28011 1726882563.28002: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28011 1726882563.28005: variable 'ansible_facts' from source: unknown 28011 1726882563.28007: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195/AnsiballZ_command.py 28011 1726882563.28257: Sending initial data 28011 1726882563.28260: Sent initial data (156 bytes) 28011 1726882563.28834: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882563.28848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882563.28861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882563.28878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882563.28907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 
28011 1726882563.28919: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882563.29018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882563.29034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882563.29049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882563.29070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882563.29135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882563.30730: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882563.30787: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882563.30851: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpcq8261o9 /root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195/AnsiballZ_command.py <<< 28011 1726882563.30916: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195/AnsiballZ_command.py" <<< 28011 1726882563.30957: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpcq8261o9" to remote "/root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195/AnsiballZ_command.py" <<< 28011 1726882563.31989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882563.32089: stderr chunk (state=3): >>><<< 28011 1726882563.32160: stdout chunk (state=3): >>><<< 28011 1726882563.32171: done transferring module to remote 28011 1726882563.32191: _low_level_execute_command(): starting 28011 1726882563.32205: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195/ /root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195/AnsiballZ_command.py && sleep 0' 28011 1726882563.33052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882563.33064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882563.33109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882563.33151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882563.33212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882563.33236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882563.33310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882563.35141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882563.35144: stdout chunk (state=3): >>><<< 28011 1726882563.35146: stderr chunk (state=3): >>><<< 28011 1726882563.35277: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882563.35280: _low_level_execute_command(): starting 28011 1726882563.35283: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195/AnsiballZ_command.py && sleep 0' 28011 1726882563.36453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882563.36773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882563.36785: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28011 1726882563.52891: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 21:36:03.515583", "end": "2024-09-20 21:36:03.521878", "delta": "0:00:00.006295", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882563.54670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882563.54770: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 28011 1726882563.54784: stdout chunk (state=3): >>><<< 28011 1726882563.54820: stderr chunk (state=3): >>><<< 28011 1726882563.54845: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 21:36:03.515583", "end": "2024-09-20 21:36:03.521878", "delta": "0:00:00.006295", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882563.54921: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882563.54945: _low_level_execute_command(): starting 28011 1726882563.54956: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882563.2475467-29490-183452801930195/ > /dev/null 2>&1 && sleep 0' 28011 1726882563.56304: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882563.56321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882563.56338: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 28011 1726882563.56358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882563.56509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882563.56734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882563.56809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882563.58680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882563.58690: stdout chunk (state=3): >>><<< 28011 1726882563.58707: stderr chunk (state=3): >>><<< 28011 1726882563.58739: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882563.58749: handler run complete 28011 1726882563.58781: Evaluated conditional (False): False 28011 1726882563.58901: attempt loop complete, returning result 28011 1726882563.58904: _execute() done 28011 1726882563.58906: dumping result to json 28011 1726882563.58908: done dumping result, returning 28011 1726882563.58910: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [12673a56-9f93-962d-7c65-000000000780] 28011 1726882563.58912: sending task result for task 12673a56-9f93-962d-7c65-000000000780 ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.006295", "end": "2024-09-20 21:36:03.521878", "rc": 0, "start": "2024-09-20 21:36:03.515583" } 28011 1726882563.59375: no more pending results, returning what we have 28011 1726882563.59379: results queue empty 28011 1726882563.59380: checking for any_errors_fatal 28011 1726882563.59381: done checking for any_errors_fatal 28011 1726882563.59382: checking for max_fail_percentage 28011 1726882563.59384: done checking for max_fail_percentage 28011 1726882563.59385: checking to see if all hosts have failed and the running result is not ok 28011 1726882563.59386: done checking to see if all hosts have failed 28011 1726882563.59386: 
getting the remaining hosts for this loop 28011 1726882563.59388: done getting the remaining hosts for this loop 28011 1726882563.59391: getting the next task for host managed_node1 28011 1726882563.59401: done getting next task for host managed_node1 28011 1726882563.59403: ^ task is: TASK: meta (flush_handlers) 28011 1726882563.59405: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882563.59410: getting variables 28011 1726882563.59411: in VariableManager get_vars() 28011 1726882563.59442: Calling all_inventory to load vars for managed_node1 28011 1726882563.59444: Calling groups_inventory to load vars for managed_node1 28011 1726882563.59448: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882563.59460: Calling all_plugins_play to load vars for managed_node1 28011 1726882563.59463: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882563.59466: Calling groups_plugins_play to load vars for managed_node1 28011 1726882563.60230: done sending task result for task 12673a56-9f93-962d-7c65-000000000780 28011 1726882563.60234: WORKER PROCESS EXITING 28011 1726882563.61502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882563.63295: done with get_vars() 28011 1726882563.63326: done getting variables 28011 1726882563.63440: in VariableManager get_vars() 28011 1726882563.63451: Calling all_inventory to load vars for managed_node1 28011 1726882563.63454: Calling groups_inventory to load vars for managed_node1 28011 1726882563.63459: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882563.63465: Calling all_plugins_play to load vars for managed_node1 28011 1726882563.63468: Calling 
groups_plugins_inventory to load vars for managed_node1 28011 1726882563.63471: Calling groups_plugins_play to load vars for managed_node1 28011 1726882563.65649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882563.67414: done with get_vars() 28011 1726882563.67441: done queuing things up, now waiting for results queue to drain 28011 1726882563.67443: results queue empty 28011 1726882563.67444: checking for any_errors_fatal 28011 1726882563.67447: done checking for any_errors_fatal 28011 1726882563.67448: checking for max_fail_percentage 28011 1726882563.67449: done checking for max_fail_percentage 28011 1726882563.67449: checking to see if all hosts have failed and the running result is not ok 28011 1726882563.67450: done checking to see if all hosts have failed 28011 1726882563.67451: getting the remaining hosts for this loop 28011 1726882563.67452: done getting the remaining hosts for this loop 28011 1726882563.67454: getting the next task for host managed_node1 28011 1726882563.67458: done getting next task for host managed_node1 28011 1726882563.67459: ^ task is: TASK: meta (flush_handlers) 28011 1726882563.67461: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882563.67463: getting variables 28011 1726882563.67464: in VariableManager get_vars() 28011 1726882563.67472: Calling all_inventory to load vars for managed_node1 28011 1726882563.67474: Calling groups_inventory to load vars for managed_node1 28011 1726882563.67476: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882563.67481: Calling all_plugins_play to load vars for managed_node1 28011 1726882563.67484: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882563.67486: Calling groups_plugins_play to load vars for managed_node1 28011 1726882563.68859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882563.70741: done with get_vars() 28011 1726882563.70760: done getting variables 28011 1726882563.70817: in VariableManager get_vars() 28011 1726882563.70832: Calling all_inventory to load vars for managed_node1 28011 1726882563.70834: Calling groups_inventory to load vars for managed_node1 28011 1726882563.70837: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882563.70842: Calling all_plugins_play to load vars for managed_node1 28011 1726882563.70844: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882563.70847: Calling groups_plugins_play to load vars for managed_node1 28011 1726882563.71986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882563.73561: done with get_vars() 28011 1726882563.73601: done queuing things up, now waiting for results queue to drain 28011 1726882563.73603: results queue empty 28011 1726882563.73604: checking for any_errors_fatal 28011 1726882563.73605: done checking for any_errors_fatal 28011 1726882563.73606: checking for max_fail_percentage 28011 1726882563.73607: done checking for max_fail_percentage 28011 1726882563.73608: checking to see if all hosts have failed and the running result is not 
ok 28011 1726882563.73608: done checking to see if all hosts have failed 28011 1726882563.73609: getting the remaining hosts for this loop 28011 1726882563.73610: done getting the remaining hosts for this loop 28011 1726882563.73614: getting the next task for host managed_node1 28011 1726882563.73617: done getting next task for host managed_node1 28011 1726882563.73618: ^ task is: None 28011 1726882563.73619: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882563.73621: done queuing things up, now waiting for results queue to drain 28011 1726882563.73621: results queue empty 28011 1726882563.73622: checking for any_errors_fatal 28011 1726882563.73623: done checking for any_errors_fatal 28011 1726882563.73623: checking for max_fail_percentage 28011 1726882563.73624: done checking for max_fail_percentage 28011 1726882563.73625: checking to see if all hosts have failed and the running result is not ok 28011 1726882563.73625: done checking to see if all hosts have failed 28011 1726882563.73627: getting the next task for host managed_node1 28011 1726882563.73629: done getting next task for host managed_node1 28011 1726882563.73630: ^ task is: None 28011 1726882563.73631: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882563.73676: in VariableManager get_vars() 28011 1726882563.73704: done with get_vars() 28011 1726882563.73713: in VariableManager get_vars() 28011 1726882563.73731: done with get_vars() 28011 1726882563.73734: variable 'omit' from source: magic vars 28011 1726882563.73823: variable 'profile' from source: play vars 28011 1726882563.73896: in VariableManager get_vars() 28011 1726882563.73907: done with get_vars() 28011 1726882563.73923: variable 'omit' from source: magic vars 28011 1726882563.73966: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 28011 1726882563.74403: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28011 1726882563.74423: getting the remaining hosts for this loop 28011 1726882563.74424: done getting the remaining hosts for this loop 28011 1726882563.74426: getting the next task for host managed_node1 28011 1726882563.74427: done getting next task for host managed_node1 28011 1726882563.74429: ^ task is: TASK: Gathering Facts 28011 1726882563.74430: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882563.74431: getting variables 28011 1726882563.74432: in VariableManager get_vars() 28011 1726882563.74439: Calling all_inventory to load vars for managed_node1 28011 1726882563.74440: Calling groups_inventory to load vars for managed_node1 28011 1726882563.74442: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882563.74446: Calling all_plugins_play to load vars for managed_node1 28011 1726882563.74448: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882563.74450: Calling groups_plugins_play to load vars for managed_node1 28011 1726882563.75188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882563.76482: done with get_vars() 28011 1726882563.76501: done getting variables 28011 1726882563.76536: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 21:36:03 -0400 (0:00:00.571) 0:00:33.316 ****** 28011 1726882563.76554: entering _queue_task() for managed_node1/gather_facts 28011 1726882563.76792: worker is 1 (out of 1 available) 28011 1726882563.76810: exiting _queue_task() for managed_node1/gather_facts 28011 1726882563.76823: done queuing things up, now waiting for results queue to drain 28011 1726882563.76825: waiting for pending results... 
28011 1726882563.77002: running TaskExecutor() for managed_node1/TASK: Gathering Facts 28011 1726882563.77080: in run() - task 12673a56-9f93-962d-7c65-00000000078e 28011 1726882563.77095: variable 'ansible_search_path' from source: unknown 28011 1726882563.77124: calling self._execute() 28011 1726882563.77206: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882563.77210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882563.77220: variable 'omit' from source: magic vars 28011 1726882563.77508: variable 'ansible_distribution_major_version' from source: facts 28011 1726882563.77517: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882563.77523: variable 'omit' from source: magic vars 28011 1726882563.77539: variable 'omit' from source: magic vars 28011 1726882563.77564: variable 'omit' from source: magic vars 28011 1726882563.77604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882563.77630: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882563.77646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882563.77659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882563.77669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882563.77697: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882563.77702: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882563.77705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882563.77778: Set connection var ansible_connection to ssh 28011 1726882563.77785: Set 
connection var ansible_pipelining to False 28011 1726882563.77796: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882563.77801: Set connection var ansible_shell_executable to /bin/sh 28011 1726882563.77808: Set connection var ansible_timeout to 10 28011 1726882563.77812: Set connection var ansible_shell_type to sh 28011 1726882563.77834: variable 'ansible_shell_executable' from source: unknown 28011 1726882563.77837: variable 'ansible_connection' from source: unknown 28011 1726882563.77840: variable 'ansible_module_compression' from source: unknown 28011 1726882563.77843: variable 'ansible_shell_type' from source: unknown 28011 1726882563.77845: variable 'ansible_shell_executable' from source: unknown 28011 1726882563.77848: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882563.77852: variable 'ansible_pipelining' from source: unknown 28011 1726882563.77854: variable 'ansible_timeout' from source: unknown 28011 1726882563.77856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882563.77987: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882563.78001: variable 'omit' from source: magic vars 28011 1726882563.78004: starting attempt loop 28011 1726882563.78007: running the handler 28011 1726882563.78021: variable 'ansible_facts' from source: unknown 28011 1726882563.78043: _low_level_execute_command(): starting 28011 1726882563.78046: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882563.78723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882563.78740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882563.78757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882563.78826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882563.80394: stdout chunk (state=3): >>>/root <<< 28011 1726882563.80530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882563.80541: stdout chunk (state=3): >>><<< 28011 1726882563.80562: stderr chunk (state=3): >>><<< 28011 1726882563.80658: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882563.80661: _low_level_execute_command(): starting 28011 1726882563.80664: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032 `" && echo ansible-tmp-1726882563.8058593-29523-270328696429032="` echo /root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032 `" ) && sleep 0' 28011 1726882563.81222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882563.81244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882563.81262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882563.81284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882563.81306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882563.81365: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882563.81424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882563.81451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882563.81482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882563.81547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882563.83424: stdout chunk (state=3): >>>ansible-tmp-1726882563.8058593-29523-270328696429032=/root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032 <<< 28011 1726882563.83581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882563.83585: stdout chunk (state=3): >>><<< 28011 1726882563.83587: stderr chunk (state=3): >>><<< 28011 1726882563.83606: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882563.8058593-29523-270328696429032=/root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882563.83798: variable 'ansible_module_compression' from source: unknown 28011 1726882563.83802: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28011 1726882563.83804: variable 'ansible_facts' from source: unknown 28011 1726882563.83984: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032/AnsiballZ_setup.py 28011 1726882563.84158: Sending initial data 28011 1726882563.84168: Sent initial data (154 bytes) 28011 1726882563.84763: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882563.84873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882563.84896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882563.84970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882563.86525: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882563.86564: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882563.86603: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmps16w6b28 /root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032/AnsiballZ_setup.py <<< 28011 1726882563.86625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032/AnsiballZ_setup.py" <<< 28011 1726882563.86817: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmps16w6b28" to remote "/root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032/AnsiballZ_setup.py" <<< 28011 1726882563.88428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882563.88473: stderr chunk (state=3): >>><<< 28011 1726882563.88482: stdout chunk (state=3): >>><<< 28011 1726882563.88510: done transferring module to remote 28011 1726882563.88523: _low_level_execute_command(): starting 28011 1726882563.88532: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032/ /root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032/AnsiballZ_setup.py && sleep 0' 28011 1726882563.89035: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882563.89047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882563.89051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 
1726882563.89135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882563.89184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882563.89204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882563.89253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882563.89367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882563.91227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882563.91234: stdout chunk (state=3): >>><<< 28011 1726882563.91236: stderr chunk (state=3): >>><<< 28011 1726882563.91239: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882563.91241: _low_level_execute_command(): starting 28011 1726882563.91243: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032/AnsiballZ_setup.py && sleep 0' 28011 1726882563.92020: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882563.92087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882563.92112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882563.92132: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 28011 1726882563.92252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882564.54607: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", 
"10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.44384765625, "5m": 0.3916015625, "15m": 0.21875}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", 
"LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "r<<< 28011 1726882564.54716: stdout chunk (state=3): >>>oot", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2955, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 576, "free": 2955}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", 
"ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 997, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789868032, "block_size": 4096, "block_total": 65519099, "block_available": 63913542, "block_used": 1605557, "inode_total": 131070960, "inode_available": 131029044, "inode_used": 41916, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": 
"255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "04", "epoch": "1726882564", "epoch_int": "1726882564", "date": "2024-09-20", "time": "21:36:04", "iso8601_micro": 
"2024-09-21T01:36:04.542965Z", "iso8601": "2024-09-21T01:36:04Z", "iso8601_basic": "20240920T213604542965", "iso8601_basic_short": "20240920T213604", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28011 1726882564.57101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882564.57106: stdout chunk (state=3): >>><<< 28011 1726882564.57108: stderr chunk (state=3): >>><<< 28011 1726882564.57111: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 
2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.44384765625, "5m": 0.3916015625, "15m": 0.21875}, "ansible_user_id": "root", "ansible_user_uid": 0, 
"ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2955, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 576, "free": 2955}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": 
"4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 997, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789868032, "block_size": 4096, "block_total": 65519099, "block_available": 63913542, "block_used": 1605557, "inode_total": 131070960, "inode_available": 
131029044, "inode_used": 41916, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": 
"off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off 
[fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], 
"ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "04", "epoch": "1726882564", "epoch_int": "1726882564", "date": "2024-09-20", "time": "21:36:04", "iso8601_micro": "2024-09-21T01:36:04.542965Z", "iso8601": "2024-09-21T01:36:04Z", "iso8601_basic": "20240920T213604542965", "iso8601_basic_short": "20240920T213604", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882564.57332: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882564.57478: _low_level_execute_command(): starting 28011 1726882564.57507: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882563.8058593-29523-270328696429032/ > /dev/null 2>&1 && sleep 0' 28011 1726882564.58691: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882564.58708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882564.58769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882564.58919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882564.58986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882564.59003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882564.59112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882564.60884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882564.61002: stderr chunk (state=3): >>><<< 28011 1726882564.61015: stdout chunk (state=3): >>><<< 28011 1726882564.61055: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882564.61260: handler run complete 28011 1726882564.61421: variable 'ansible_facts' from source: unknown 28011 1726882564.61656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882564.62276: variable 'ansible_facts' from source: unknown 28011 1726882564.62489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882564.62742: attempt loop complete, returning result 28011 1726882564.62785: _execute() done 28011 1726882564.62801: dumping result to json 28011 1726882564.62921: done dumping result, returning 28011 1726882564.62934: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-962d-7c65-00000000078e] 28011 1726882564.62943: sending task result for task 12673a56-9f93-962d-7c65-00000000078e ok: [managed_node1] 28011 1726882564.64258: no more pending results, returning what we have 28011 1726882564.64261: results queue empty 28011 1726882564.64262: checking for any_errors_fatal 28011 1726882564.64264: done checking for any_errors_fatal 28011 1726882564.64264: checking for max_fail_percentage 28011 1726882564.64266: done checking for max_fail_percentage 28011 1726882564.64267: checking to see if all hosts have failed and the running result is not ok 28011 1726882564.64268: done checking to see if all hosts have failed 28011 1726882564.64268: getting the remaining hosts for this loop 28011 1726882564.64270: done getting the remaining hosts for this loop 28011 1726882564.64274: getting the next task for host managed_node1 28011 1726882564.64279: done getting next task for host managed_node1 28011 1726882564.64281: ^ task is: TASK: meta (flush_handlers) 28011 1726882564.64283: ^ state is: HOST STATE: block=1, task=1, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882564.64287: getting variables 28011 1726882564.64288: in VariableManager get_vars() 28011 1726882564.64322: Calling all_inventory to load vars for managed_node1 28011 1726882564.64325: Calling groups_inventory to load vars for managed_node1 28011 1726882564.64328: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882564.64341: Calling all_plugins_play to load vars for managed_node1 28011 1726882564.64344: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882564.64348: Calling groups_plugins_play to load vars for managed_node1 28011 1726882564.65122: done sending task result for task 12673a56-9f93-962d-7c65-00000000078e 28011 1726882564.65125: WORKER PROCESS EXITING 28011 1726882564.67288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882564.69308: done with get_vars() 28011 1726882564.69337: done getting variables 28011 1726882564.69433: in VariableManager get_vars() 28011 1726882564.69447: Calling all_inventory to load vars for managed_node1 28011 1726882564.69449: Calling groups_inventory to load vars for managed_node1 28011 1726882564.69451: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882564.69457: Calling all_plugins_play to load vars for managed_node1 28011 1726882564.69459: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882564.69461: Calling groups_plugins_play to load vars for managed_node1 28011 1726882564.71205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882564.72906: done with get_vars() 28011 1726882564.72937: done queuing things up, now waiting for results 
queue to drain 28011 1726882564.72940: results queue empty 28011 1726882564.72940: checking for any_errors_fatal 28011 1726882564.72945: done checking for any_errors_fatal 28011 1726882564.72946: checking for max_fail_percentage 28011 1726882564.72947: done checking for max_fail_percentage 28011 1726882564.72952: checking to see if all hosts have failed and the running result is not ok 28011 1726882564.72953: done checking to see if all hosts have failed 28011 1726882564.72954: getting the remaining hosts for this loop 28011 1726882564.72955: done getting the remaining hosts for this loop 28011 1726882564.72958: getting the next task for host managed_node1 28011 1726882564.72962: done getting next task for host managed_node1 28011 1726882564.72965: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28011 1726882564.72967: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882564.72977: getting variables 28011 1726882564.72978: in VariableManager get_vars() 28011 1726882564.72992: Calling all_inventory to load vars for managed_node1 28011 1726882564.72996: Calling groups_inventory to load vars for managed_node1 28011 1726882564.72998: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882564.73003: Calling all_plugins_play to load vars for managed_node1 28011 1726882564.73006: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882564.73008: Calling groups_plugins_play to load vars for managed_node1 28011 1726882564.74234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882564.81623: done with get_vars() 28011 1726882564.81643: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:36:04 -0400 (0:00:01.051) 0:00:34.368 ****** 28011 1726882564.81703: entering _queue_task() for managed_node1/include_tasks 28011 1726882564.82043: worker is 1 (out of 1 available) 28011 1726882564.82055: exiting _queue_task() for managed_node1/include_tasks 28011 1726882564.82067: done queuing things up, now waiting for results queue to drain 28011 1726882564.82069: waiting for pending results... 
28011 1726882564.82353: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28011 1726882564.82472: in run() - task 12673a56-9f93-962d-7c65-0000000000d7 28011 1726882564.82497: variable 'ansible_search_path' from source: unknown 28011 1726882564.82505: variable 'ansible_search_path' from source: unknown 28011 1726882564.82552: calling self._execute() 28011 1726882564.82663: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882564.82678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882564.82699: variable 'omit' from source: magic vars 28011 1726882564.83106: variable 'ansible_distribution_major_version' from source: facts 28011 1726882564.83115: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882564.83121: _execute() done 28011 1726882564.83124: dumping result to json 28011 1726882564.83128: done dumping result, returning 28011 1726882564.83135: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-962d-7c65-0000000000d7] 28011 1726882564.83140: sending task result for task 12673a56-9f93-962d-7c65-0000000000d7 28011 1726882564.83231: done sending task result for task 12673a56-9f93-962d-7c65-0000000000d7 28011 1726882564.83234: WORKER PROCESS EXITING 28011 1726882564.83308: no more pending results, returning what we have 28011 1726882564.83313: in VariableManager get_vars() 28011 1726882564.83355: Calling all_inventory to load vars for managed_node1 28011 1726882564.83358: Calling groups_inventory to load vars for managed_node1 28011 1726882564.83360: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882564.83372: Calling all_plugins_play to load vars for managed_node1 28011 1726882564.83376: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882564.83379: Calling 
groups_plugins_play to load vars for managed_node1 28011 1726882564.84782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882564.85754: done with get_vars() 28011 1726882564.85768: variable 'ansible_search_path' from source: unknown 28011 1726882564.85769: variable 'ansible_search_path' from source: unknown 28011 1726882564.85794: we have included files to process 28011 1726882564.85795: generating all_blocks data 28011 1726882564.85796: done generating all_blocks data 28011 1726882564.85797: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28011 1726882564.85798: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28011 1726882564.85799: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28011 1726882564.86176: done processing included file 28011 1726882564.86177: iterating over new_blocks loaded from include file 28011 1726882564.86178: in VariableManager get_vars() 28011 1726882564.86196: done with get_vars() 28011 1726882564.86198: filtering new block on tags 28011 1726882564.86208: done filtering new block on tags 28011 1726882564.86209: in VariableManager get_vars() 28011 1726882564.86224: done with get_vars() 28011 1726882564.86225: filtering new block on tags 28011 1726882564.86236: done filtering new block on tags 28011 1726882564.86237: in VariableManager get_vars() 28011 1726882564.86248: done with get_vars() 28011 1726882564.86249: filtering new block on tags 28011 1726882564.86257: done filtering new block on tags 28011 1726882564.86259: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 28011 1726882564.86262: extending task lists for 
all hosts with included blocks 28011 1726882564.86505: done extending task lists 28011 1726882564.86507: done processing included files 28011 1726882564.86508: results queue empty 28011 1726882564.86508: checking for any_errors_fatal 28011 1726882564.86513: done checking for any_errors_fatal 28011 1726882564.86514: checking for max_fail_percentage 28011 1726882564.86515: done checking for max_fail_percentage 28011 1726882564.86516: checking to see if all hosts have failed and the running result is not ok 28011 1726882564.86517: done checking to see if all hosts have failed 28011 1726882564.86518: getting the remaining hosts for this loop 28011 1726882564.86519: done getting the remaining hosts for this loop 28011 1726882564.86521: getting the next task for host managed_node1 28011 1726882564.86525: done getting next task for host managed_node1 28011 1726882564.86527: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28011 1726882564.86529: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882564.86535: getting variables 28011 1726882564.86535: in VariableManager get_vars() 28011 1726882564.86544: Calling all_inventory to load vars for managed_node1 28011 1726882564.86546: Calling groups_inventory to load vars for managed_node1 28011 1726882564.86547: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882564.86552: Calling all_plugins_play to load vars for managed_node1 28011 1726882564.86557: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882564.86560: Calling groups_plugins_play to load vars for managed_node1 28011 1726882564.87657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882564.88873: done with get_vars() 28011 1726882564.88890: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:36:04 -0400 (0:00:00.072) 0:00:34.440 ****** 28011 1726882564.88938: entering _queue_task() for managed_node1/setup 28011 1726882564.89176: worker is 1 (out of 1 available) 28011 1726882564.89190: exiting _queue_task() for managed_node1/setup 28011 1726882564.89203: done queuing things up, now waiting for results queue to drain 28011 1726882564.89205: waiting for pending results... 
28011 1726882564.89376: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28011 1726882564.89476: in run() - task 12673a56-9f93-962d-7c65-0000000007cf 28011 1726882564.89486: variable 'ansible_search_path' from source: unknown 28011 1726882564.89490: variable 'ansible_search_path' from source: unknown 28011 1726882564.89525: calling self._execute() 28011 1726882564.89601: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882564.89605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882564.89616: variable 'omit' from source: magic vars 28011 1726882564.89882: variable 'ansible_distribution_major_version' from source: facts 28011 1726882564.89897: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882564.90038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882564.91814: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882564.91862: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882564.91892: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882564.91920: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882564.91941: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882564.92000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882564.92023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882564.92042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882564.92068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882564.92079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882564.92120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882564.92142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882564.92156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882564.92180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882564.92191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882564.92301: variable '__network_required_facts' from source: role 
'' defaults 28011 1726882564.92311: variable 'ansible_facts' from source: unknown 28011 1726882564.92765: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28011 1726882564.92769: when evaluation is False, skipping this task 28011 1726882564.92772: _execute() done 28011 1726882564.92775: dumping result to json 28011 1726882564.92777: done dumping result, returning 28011 1726882564.92783: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-962d-7c65-0000000007cf] 28011 1726882564.92791: sending task result for task 12673a56-9f93-962d-7c65-0000000007cf 28011 1726882564.92876: done sending task result for task 12673a56-9f93-962d-7c65-0000000007cf 28011 1726882564.92879: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882564.92940: no more pending results, returning what we have 28011 1726882564.92944: results queue empty 28011 1726882564.92945: checking for any_errors_fatal 28011 1726882564.92946: done checking for any_errors_fatal 28011 1726882564.92947: checking for max_fail_percentage 28011 1726882564.92949: done checking for max_fail_percentage 28011 1726882564.92950: checking to see if all hosts have failed and the running result is not ok 28011 1726882564.92950: done checking to see if all hosts have failed 28011 1726882564.92951: getting the remaining hosts for this loop 28011 1726882564.92952: done getting the remaining hosts for this loop 28011 1726882564.92956: getting the next task for host managed_node1 28011 1726882564.92972: done getting next task for host managed_node1 28011 1726882564.92976: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28011 1726882564.92979: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882564.92996: getting variables 28011 1726882564.92998: in VariableManager get_vars() 28011 1726882564.93035: Calling all_inventory to load vars for managed_node1 28011 1726882564.93037: Calling groups_inventory to load vars for managed_node1 28011 1726882564.93039: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882564.93048: Calling all_plugins_play to load vars for managed_node1 28011 1726882564.93051: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882564.93053: Calling groups_plugins_play to load vars for managed_node1 28011 1726882564.93890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882564.94789: done with get_vars() 28011 1726882564.94806: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:36:04 -0400 (0:00:00.059) 0:00:34.500 ****** 28011 1726882564.94873: entering _queue_task() for managed_node1/stat 28011 1726882564.95100: worker is 1 (out of 1 available) 28011 1726882564.95113: exiting _queue_task() for managed_node1/stat 28011 1726882564.95124: done queuing things up, now waiting for results queue to drain 28011 1726882564.95125: waiting for pending results... 
28011 1726882564.95292: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 28011 1726882564.95380: in run() - task 12673a56-9f93-962d-7c65-0000000007d1 28011 1726882564.95392: variable 'ansible_search_path' from source: unknown 28011 1726882564.95398: variable 'ansible_search_path' from source: unknown 28011 1726882564.95425: calling self._execute() 28011 1726882564.95498: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882564.95510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882564.95519: variable 'omit' from source: magic vars 28011 1726882564.95775: variable 'ansible_distribution_major_version' from source: facts 28011 1726882564.95784: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882564.95896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882564.96078: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882564.96110: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882564.96137: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882564.96162: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882564.96250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882564.96268: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882564.96285: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882564.96305: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882564.96367: variable '__network_is_ostree' from source: set_fact 28011 1726882564.96372: Evaluated conditional (not __network_is_ostree is defined): False 28011 1726882564.96376: when evaluation is False, skipping this task 28011 1726882564.96378: _execute() done 28011 1726882564.96381: dumping result to json 28011 1726882564.96384: done dumping result, returning 28011 1726882564.96391: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-962d-7c65-0000000007d1] 28011 1726882564.96396: sending task result for task 12673a56-9f93-962d-7c65-0000000007d1 28011 1726882564.96477: done sending task result for task 12673a56-9f93-962d-7c65-0000000007d1 28011 1726882564.96480: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28011 1726882564.96531: no more pending results, returning what we have 28011 1726882564.96534: results queue empty 28011 1726882564.96535: checking for any_errors_fatal 28011 1726882564.96540: done checking for any_errors_fatal 28011 1726882564.96540: checking for max_fail_percentage 28011 1726882564.96542: done checking for max_fail_percentage 28011 1726882564.96543: checking to see if all hosts have failed and the running result is not ok 28011 1726882564.96544: done checking to see if all hosts have failed 28011 1726882564.96544: getting the remaining hosts for this loop 28011 1726882564.96546: done getting the remaining hosts for this loop 28011 
1726882564.96549: getting the next task for host managed_node1 28011 1726882564.96553: done getting next task for host managed_node1 28011 1726882564.96557: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28011 1726882564.96559: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882564.96571: getting variables 28011 1726882564.96572: in VariableManager get_vars() 28011 1726882564.96607: Calling all_inventory to load vars for managed_node1 28011 1726882564.96609: Calling groups_inventory to load vars for managed_node1 28011 1726882564.96611: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882564.96619: Calling all_plugins_play to load vars for managed_node1 28011 1726882564.96621: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882564.96624: Calling groups_plugins_play to load vars for managed_node1 28011 1726882564.97471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882564.98373: done with get_vars() 28011 1726882564.98389: done getting variables 28011 1726882564.98431: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:36:04 -0400 (0:00:00.035) 0:00:34.535 ****** 28011 1726882564.98453: entering _queue_task() for managed_node1/set_fact 28011 1726882564.98657: worker is 1 (out of 1 available) 28011 1726882564.98670: exiting _queue_task() for managed_node1/set_fact 28011 1726882564.98682: done queuing things up, now waiting for results queue to drain 28011 1726882564.98683: waiting for pending results... 28011 1726882564.98856: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28011 1726882564.98947: in run() - task 12673a56-9f93-962d-7c65-0000000007d2 28011 1726882564.98958: variable 'ansible_search_path' from source: unknown 28011 1726882564.98961: variable 'ansible_search_path' from source: unknown 28011 1726882564.98988: calling self._execute() 28011 1726882564.99063: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882564.99068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882564.99077: variable 'omit' from source: magic vars 28011 1726882564.99358: variable 'ansible_distribution_major_version' from source: facts 28011 1726882564.99366: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882564.99479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882564.99666: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882564.99703: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882564.99727: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 
1726882564.99751: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882564.99843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882564.99884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882565.00099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882565.00103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882565.00106: variable '__network_is_ostree' from source: set_fact 28011 1726882565.00108: Evaluated conditional (not __network_is_ostree is defined): False 28011 1726882565.00110: when evaluation is False, skipping this task 28011 1726882565.00112: _execute() done 28011 1726882565.00114: dumping result to json 28011 1726882565.00116: done dumping result, returning 28011 1726882565.00119: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-962d-7c65-0000000007d2] 28011 1726882565.00121: sending task result for task 12673a56-9f93-962d-7c65-0000000007d2 28011 1726882565.00178: done sending task result for task 12673a56-9f93-962d-7c65-0000000007d2 28011 1726882565.00182: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28011 1726882565.00229: no more pending results, returning what we 
have 28011 1726882565.00233: results queue empty 28011 1726882565.00234: checking for any_errors_fatal 28011 1726882565.00240: done checking for any_errors_fatal 28011 1726882565.00241: checking for max_fail_percentage 28011 1726882565.00243: done checking for max_fail_percentage 28011 1726882565.00244: checking to see if all hosts have failed and the running result is not ok 28011 1726882565.00245: done checking to see if all hosts have failed 28011 1726882565.00246: getting the remaining hosts for this loop 28011 1726882565.00247: done getting the remaining hosts for this loop 28011 1726882565.00251: getting the next task for host managed_node1 28011 1726882565.00262: done getting next task for host managed_node1 28011 1726882565.00399: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28011 1726882565.00403: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882565.00416: getting variables 28011 1726882565.00418: in VariableManager get_vars() 28011 1726882565.00447: Calling all_inventory to load vars for managed_node1 28011 1726882565.00449: Calling groups_inventory to load vars for managed_node1 28011 1726882565.00451: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882565.00459: Calling all_plugins_play to load vars for managed_node1 28011 1726882565.00462: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882565.00465: Calling groups_plugins_play to load vars for managed_node1 28011 1726882565.01602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882565.02609: done with get_vars() 28011 1726882565.02625: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:36:05 -0400 (0:00:00.042) 0:00:34.578 ****** 28011 1726882565.02691: entering _queue_task() for managed_node1/service_facts 28011 1726882565.02947: worker is 1 (out of 1 available) 28011 1726882565.02961: exiting _queue_task() for managed_node1/service_facts 28011 1726882565.02973: done queuing things up, now waiting for results queue to drain 28011 1726882565.02974: waiting for pending results... 
28011 1726882565.03268: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 28011 1726882565.03417: in run() - task 12673a56-9f93-962d-7c65-0000000007d4 28011 1726882565.03435: variable 'ansible_search_path' from source: unknown 28011 1726882565.03442: variable 'ansible_search_path' from source: unknown 28011 1726882565.03598: calling self._execute() 28011 1726882565.03601: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882565.03605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882565.03607: variable 'omit' from source: magic vars 28011 1726882565.03957: variable 'ansible_distribution_major_version' from source: facts 28011 1726882565.03973: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882565.03982: variable 'omit' from source: magic vars 28011 1726882565.04046: variable 'omit' from source: magic vars 28011 1726882565.04085: variable 'omit' from source: magic vars 28011 1726882565.04130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882565.04174: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882565.04204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882565.04225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882565.04240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882565.04277: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882565.04290: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882565.04373: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 28011 1726882565.04409: Set connection var ansible_connection to ssh 28011 1726882565.04423: Set connection var ansible_pipelining to False 28011 1726882565.04433: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882565.04443: Set connection var ansible_shell_executable to /bin/sh 28011 1726882565.04454: Set connection var ansible_timeout to 10 28011 1726882565.04462: Set connection var ansible_shell_type to sh 28011 1726882565.04498: variable 'ansible_shell_executable' from source: unknown 28011 1726882565.04506: variable 'ansible_connection' from source: unknown 28011 1726882565.04513: variable 'ansible_module_compression' from source: unknown 28011 1726882565.04519: variable 'ansible_shell_type' from source: unknown 28011 1726882565.04525: variable 'ansible_shell_executable' from source: unknown 28011 1726882565.04531: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882565.04538: variable 'ansible_pipelining' from source: unknown 28011 1726882565.04544: variable 'ansible_timeout' from source: unknown 28011 1726882565.04551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882565.04751: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882565.04767: variable 'omit' from source: magic vars 28011 1726882565.04807: starting attempt loop 28011 1726882565.04810: running the handler 28011 1726882565.04812: _low_level_execute_command(): starting 28011 1726882565.04815: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882565.05484: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28011 1726882565.05570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882565.05575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882565.05589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882565.05644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882565.07296: stdout chunk (state=3): >>>/root <<< 28011 1726882565.07564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882565.07568: stdout chunk (state=3): >>><<< 28011 1726882565.07570: stderr chunk (state=3): >>><<< 28011 1726882565.07574: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882565.07577: _low_level_execute_command(): starting 28011 1726882565.07579: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350 `" && echo ansible-tmp-1726882565.0749452-29588-57362006052350="` echo /root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350 `" ) && sleep 0' 28011 1726882565.08641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882565.08750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882565.08935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882565.09031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882565.10857: stdout chunk (state=3): >>>ansible-tmp-1726882565.0749452-29588-57362006052350=/root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350 <<< 28011 1726882565.11071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882565.11102: stderr chunk (state=3): >>><<< 28011 1726882565.11398: stdout chunk (state=3): >>><<< 28011 1726882565.11402: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882565.0749452-29588-57362006052350=/root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882565.11404: variable 'ansible_module_compression' from source: unknown 28011 1726882565.11407: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28011 1726882565.11409: variable 'ansible_facts' from source: unknown 28011 1726882565.11818: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350/AnsiballZ_service_facts.py 28011 1726882565.11952: Sending initial data 28011 1726882565.11960: Sent initial data (161 bytes) 28011 1726882565.13268: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882565.13283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882565.13300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882565.13351: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882565.13361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882565.13384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882565.13511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882565.15032: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882565.15105: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882565.15209: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpiippnlhp /root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350/AnsiballZ_service_facts.py <<< 28011 1726882565.15212: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350/AnsiballZ_service_facts.py" <<< 28011 1726882565.15276: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpiippnlhp" to remote "/root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350/AnsiballZ_service_facts.py" <<< 28011 1726882565.16531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882565.16592: stderr chunk (state=3): >>><<< 28011 1726882565.16604: stdout chunk (state=3): >>><<< 28011 1726882565.16738: done transferring module to remote 28011 1726882565.16741: _low_level_execute_command(): starting 28011 1726882565.16744: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350/ /root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350/AnsiballZ_service_facts.py && sleep 0' 28011 1726882565.18072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882565.18517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882565.18525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882565.18571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882565.20361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882565.20364: stderr chunk (state=3): >>><<< 28011 1726882565.20366: stdout chunk (state=3): >>><<< 28011 1726882565.20380: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882565.20398: _low_level_execute_command(): starting 28011 1726882565.20528: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350/AnsiballZ_service_facts.py && sleep 0' 28011 1726882565.21701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882565.21706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882565.21724: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882565.21727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882565.21996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882565.22000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 
1726882566.72832: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": 
{"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": 
{"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": 
"NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 28011 1726882566.72889: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 28011 1726882566.72933: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28011 1726882566.74388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882566.74598: stderr chunk (state=3): >>><<< 28011 1726882566.74602: stdout chunk (state=3): >>><<< 28011 1726882566.74609: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": 
"blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882566.75197: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882566.75207: _low_level_execute_command(): starting 28011 1726882566.75213: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882565.0749452-29588-57362006052350/ > /dev/null 2>&1 && sleep 0' 28011 1726882566.75901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882566.75952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882566.75995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882566.77785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882566.77801: stdout chunk (state=3): >>><<< 28011 1726882566.77818: stderr chunk (state=3): >>><<< 28011 1726882566.77835: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882566.77846: handler run complete 28011 1726882566.78202: variable 'ansible_facts' from source: unknown 28011 1726882566.78213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 
1726882566.78813: variable 'ansible_facts' from source: unknown 28011 1726882566.78992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882566.79198: attempt loop complete, returning result 28011 1726882566.79209: _execute() done 28011 1726882566.79216: dumping result to json 28011 1726882566.79276: done dumping result, returning 28011 1726882566.79301: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-962d-7c65-0000000007d4] 28011 1726882566.79311: sending task result for task 12673a56-9f93-962d-7c65-0000000007d4 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882566.80378: no more pending results, returning what we have 28011 1726882566.80380: results queue empty 28011 1726882566.80381: checking for any_errors_fatal 28011 1726882566.80389: done checking for any_errors_fatal 28011 1726882566.80389: checking for max_fail_percentage 28011 1726882566.80391: done checking for max_fail_percentage 28011 1726882566.80392: checking to see if all hosts have failed and the running result is not ok 28011 1726882566.80394: done checking to see if all hosts have failed 28011 1726882566.80395: getting the remaining hosts for this loop 28011 1726882566.80396: done getting the remaining hosts for this loop 28011 1726882566.80399: getting the next task for host managed_node1 28011 1726882566.80404: done getting next task for host managed_node1 28011 1726882566.80408: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28011 1726882566.80410: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882566.80420: getting variables 28011 1726882566.80421: in VariableManager get_vars() 28011 1726882566.80454: Calling all_inventory to load vars for managed_node1 28011 1726882566.80457: Calling groups_inventory to load vars for managed_node1 28011 1726882566.80460: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882566.80470: Calling all_plugins_play to load vars for managed_node1 28011 1726882566.80472: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882566.80475: Calling groups_plugins_play to load vars for managed_node1 28011 1726882566.81009: done sending task result for task 12673a56-9f93-962d-7c65-0000000007d4 28011 1726882566.81012: WORKER PROCESS EXITING 28011 1726882566.82175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882566.83911: done with get_vars() 28011 1726882566.83937: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:06 -0400 (0:00:01.813) 0:00:36.391 ****** 28011 1726882566.84049: entering _queue_task() for managed_node1/package_facts 28011 1726882566.84625: worker is 1 (out of 1 available) 28011 1726882566.84634: exiting _queue_task() for managed_node1/package_facts 28011 1726882566.84644: done queuing things up, now waiting for results queue to drain 28011 1726882566.84646: waiting for pending results... 
28011 1726882566.84761: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 28011 1726882566.84917: in run() - task 12673a56-9f93-962d-7c65-0000000007d5 28011 1726882566.84983: variable 'ansible_search_path' from source: unknown 28011 1726882566.84989: variable 'ansible_search_path' from source: unknown 28011 1726882566.85002: calling self._execute() 28011 1726882566.85114: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882566.85125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882566.85399: variable 'omit' from source: magic vars 28011 1726882566.85537: variable 'ansible_distribution_major_version' from source: facts 28011 1726882566.85554: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882566.85566: variable 'omit' from source: magic vars 28011 1726882566.85623: variable 'omit' from source: magic vars 28011 1726882566.85668: variable 'omit' from source: magic vars 28011 1726882566.85719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882566.85765: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882566.85798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882566.85820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882566.85836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882566.85877: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882566.85885: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882566.85898: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 28011 1726882566.86006: Set connection var ansible_connection to ssh 28011 1726882566.86021: Set connection var ansible_pipelining to False 28011 1726882566.86032: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882566.86041: Set connection var ansible_shell_executable to /bin/sh 28011 1726882566.86053: Set connection var ansible_timeout to 10 28011 1726882566.86061: Set connection var ansible_shell_type to sh 28011 1726882566.86100: variable 'ansible_shell_executable' from source: unknown 28011 1726882566.86107: variable 'ansible_connection' from source: unknown 28011 1726882566.86114: variable 'ansible_module_compression' from source: unknown 28011 1726882566.86119: variable 'ansible_shell_type' from source: unknown 28011 1726882566.86124: variable 'ansible_shell_executable' from source: unknown 28011 1726882566.86129: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882566.86134: variable 'ansible_pipelining' from source: unknown 28011 1726882566.86140: variable 'ansible_timeout' from source: unknown 28011 1726882566.86146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882566.86351: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882566.86366: variable 'omit' from source: magic vars 28011 1726882566.86399: starting attempt loop 28011 1726882566.86402: running the handler 28011 1726882566.86407: _low_level_execute_command(): starting 28011 1726882566.86420: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882566.87073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882566.87167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882566.87192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882566.87212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882566.87285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882566.88923: stdout chunk (state=3): >>>/root <<< 28011 1726882566.89079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882566.89082: stdout chunk (state=3): >>><<< 28011 1726882566.89085: stderr chunk (state=3): >>><<< 28011 1726882566.89109: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882566.89204: _low_level_execute_command(): starting 28011 1726882566.89207: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002 `" && echo ansible-tmp-1726882566.8911529-29656-68047158814002="` echo /root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002 `" ) && sleep 0' 28011 1726882566.89738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882566.89756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882566.89775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882566.89803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882566.89821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882566.89919: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882566.89950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882566.90026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882566.91890: stdout chunk (state=3): >>>ansible-tmp-1726882566.8911529-29656-68047158814002=/root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002 <<< 28011 1726882566.91994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882566.92021: stderr chunk (state=3): >>><<< 28011 1726882566.92025: stdout chunk (state=3): >>><<< 28011 1726882566.92042: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882566.8911529-29656-68047158814002=/root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882566.92082: variable 'ansible_module_compression' from source: unknown 28011 1726882566.92144: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28011 1726882566.92187: variable 'ansible_facts' from source: unknown 28011 1726882566.92309: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002/AnsiballZ_package_facts.py 28011 1726882566.92416: Sending initial data 28011 1726882566.92420: Sent initial data (161 bytes) 28011 1726882566.92841: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882566.92844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882566.92846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28011 1726882566.92848: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882566.92850: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882566.92896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882566.92900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882566.92951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882566.94465: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28011 1726882566.94473: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882566.94507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882566.94553: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmps3ln69dw /root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002/AnsiballZ_package_facts.py <<< 28011 1726882566.94556: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002/AnsiballZ_package_facts.py" <<< 28011 1726882566.94588: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmps3ln69dw" to remote "/root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002/AnsiballZ_package_facts.py" <<< 28011 1726882566.95612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882566.95649: stderr chunk (state=3): >>><<< 28011 1726882566.95652: stdout chunk (state=3): >>><<< 28011 1726882566.95690: done transferring module to remote 28011 1726882566.95698: _low_level_execute_command(): starting 28011 1726882566.95703: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002/ /root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002/AnsiballZ_package_facts.py && sleep 0' 28011 1726882566.96126: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882566.96129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882566.96131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882566.96133: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882566.96135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882566.96184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882566.96187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882566.96230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882566.97940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882566.97961: stderr chunk (state=3): >>><<< 28011 1726882566.97964: stdout chunk (state=3): >>><<< 28011 1726882566.97976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882566.97979: _low_level_execute_command(): starting 28011 1726882566.97984: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002/AnsiballZ_package_facts.py && sleep 0' 28011 1726882566.98362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882566.98366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882566.98397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882566.98400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882566.98402: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882566.98404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882566.98458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882566.98462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882566.98464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882566.98515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882567.42710: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": 
"4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 28011 1726882567.42738: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": 
"jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": 
[{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": 
"gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 28011 1726882567.42840: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": 
"1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", 
"release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": 
"gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": 
"rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", 
"version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 
0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": 
"x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28011 1726882567.44615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882567.44638: stderr chunk (state=3): >>><<< 28011 1726882567.44653: stdout chunk (state=3): >>><<< 28011 1726882567.44736: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
28011 1726882567.47067: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882567.47206: _low_level_execute_command(): starting 28011 1726882567.47215: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882566.8911529-29656-68047158814002/ > /dev/null 2>&1 && sleep 0' 28011 1726882567.48006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882567.48031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882567.48043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882567.48064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882567.48083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882567.48134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882567.48148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882567.48230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882567.50023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882567.50048: stderr chunk (state=3): >>><<< 28011 1726882567.50051: stdout chunk (state=3): >>><<< 28011 1726882567.50071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 28011 1726882567.50077: handler run complete 28011 1726882567.50605: variable 'ansible_facts' from source: unknown 28011 1726882567.50864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882567.52180: variable 'ansible_facts' from source: unknown 28011 1726882567.52559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882567.53256: attempt loop complete, returning result 28011 1726882567.53267: _execute() done 28011 1726882567.53270: dumping result to json 28011 1726882567.53474: done dumping result, returning 28011 1726882567.53500: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-962d-7c65-0000000007d5] 28011 1726882567.53503: sending task result for task 12673a56-9f93-962d-7c65-0000000007d5 28011 1726882567.55878: done sending task result for task 12673a56-9f93-962d-7c65-0000000007d5 28011 1726882567.55882: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882567.56047: no more pending results, returning what we have 28011 1726882567.56049: results queue empty 28011 1726882567.56050: checking for any_errors_fatal 28011 1726882567.56056: done checking for any_errors_fatal 28011 1726882567.56057: checking for max_fail_percentage 28011 1726882567.56059: done checking for max_fail_percentage 28011 1726882567.56059: checking to see if all hosts have failed and the running result is not ok 28011 1726882567.56060: done checking to see if all hosts have failed 28011 1726882567.56061: getting the remaining hosts for this loop 28011 1726882567.56062: done getting the remaining hosts for this loop 28011 1726882567.56065: getting the next task for host managed_node1 28011 1726882567.56071: done getting 
next task for host managed_node1 28011 1726882567.56075: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28011 1726882567.56077: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882567.56088: getting variables 28011 1726882567.56090: in VariableManager get_vars() 28011 1726882567.56123: Calling all_inventory to load vars for managed_node1 28011 1726882567.56126: Calling groups_inventory to load vars for managed_node1 28011 1726882567.56128: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882567.56137: Calling all_plugins_play to load vars for managed_node1 28011 1726882567.56140: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882567.56143: Calling groups_plugins_play to load vars for managed_node1 28011 1726882567.57377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882567.59000: done with get_vars() 28011 1726882567.59023: done getting variables 28011 1726882567.59082: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:36:07 -0400 (0:00:00.750) 0:00:37.142 ****** 28011 1726882567.59118: entering _queue_task() for managed_node1/debug 28011 1726882567.59464: worker is 1 (out of 1 available) 28011 
1726882567.59476: exiting _queue_task() for managed_node1/debug 28011 1726882567.59491: done queuing things up, now waiting for results queue to drain 28011 1726882567.59596: waiting for pending results... 28011 1726882567.59804: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 28011 1726882567.60100: in run() - task 12673a56-9f93-962d-7c65-0000000000d8 28011 1726882567.60105: variable 'ansible_search_path' from source: unknown 28011 1726882567.60108: variable 'ansible_search_path' from source: unknown 28011 1726882567.60111: calling self._execute() 28011 1726882567.60125: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882567.60136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882567.60152: variable 'omit' from source: magic vars 28011 1726882567.60536: variable 'ansible_distribution_major_version' from source: facts 28011 1726882567.60560: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882567.60571: variable 'omit' from source: magic vars 28011 1726882567.60617: variable 'omit' from source: magic vars 28011 1726882567.60726: variable 'network_provider' from source: set_fact 28011 1726882567.60749: variable 'omit' from source: magic vars 28011 1726882567.60804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882567.60843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882567.60869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882567.60902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882567.60918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 28011 1726882567.60954: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882567.60963: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882567.60972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882567.61082: Set connection var ansible_connection to ssh 28011 1726882567.61106: Set connection var ansible_pipelining to False 28011 1726882567.61119: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882567.61206: Set connection var ansible_shell_executable to /bin/sh 28011 1726882567.61209: Set connection var ansible_timeout to 10 28011 1726882567.61212: Set connection var ansible_shell_type to sh 28011 1726882567.61214: variable 'ansible_shell_executable' from source: unknown 28011 1726882567.61217: variable 'ansible_connection' from source: unknown 28011 1726882567.61219: variable 'ansible_module_compression' from source: unknown 28011 1726882567.61221: variable 'ansible_shell_type' from source: unknown 28011 1726882567.61223: variable 'ansible_shell_executable' from source: unknown 28011 1726882567.61225: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882567.61227: variable 'ansible_pipelining' from source: unknown 28011 1726882567.61229: variable 'ansible_timeout' from source: unknown 28011 1726882567.61231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882567.61368: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882567.61392: variable 'omit' from source: magic vars 28011 1726882567.61404: starting attempt loop 28011 1726882567.61411: running the handler 28011 1726882567.61463: handler run 
complete 28011 1726882567.61482: attempt loop complete, returning result 28011 1726882567.61496: _execute() done 28011 1726882567.61505: dumping result to json 28011 1726882567.61531: done dumping result, returning 28011 1726882567.61534: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-962d-7c65-0000000000d8] 28011 1726882567.61536: sending task result for task 12673a56-9f93-962d-7c65-0000000000d8 ok: [managed_node1] => {} MSG: Using network provider: nm 28011 1726882567.61695: no more pending results, returning what we have 28011 1726882567.61699: results queue empty 28011 1726882567.61700: checking for any_errors_fatal 28011 1726882567.61711: done checking for any_errors_fatal 28011 1726882567.61712: checking for max_fail_percentage 28011 1726882567.61714: done checking for max_fail_percentage 28011 1726882567.61715: checking to see if all hosts have failed and the running result is not ok 28011 1726882567.61716: done checking to see if all hosts have failed 28011 1726882567.61716: getting the remaining hosts for this loop 28011 1726882567.61718: done getting the remaining hosts for this loop 28011 1726882567.61722: getting the next task for host managed_node1 28011 1726882567.61728: done getting next task for host managed_node1 28011 1726882567.61732: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28011 1726882567.61734: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882567.61744: getting variables 28011 1726882567.61746: in VariableManager get_vars() 28011 1726882567.61783: Calling all_inventory to load vars for managed_node1 28011 1726882567.61789: Calling groups_inventory to load vars for managed_node1 28011 1726882567.62095: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882567.62106: Calling all_plugins_play to load vars for managed_node1 28011 1726882567.62109: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882567.62113: Calling groups_plugins_play to load vars for managed_node1 28011 1726882567.62806: done sending task result for task 12673a56-9f93-962d-7c65-0000000000d8 28011 1726882567.62809: WORKER PROCESS EXITING 28011 1726882567.63547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882567.65162: done with get_vars() 28011 1726882567.65192: done getting variables 28011 1726882567.65253: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:36:07 -0400 (0:00:00.061) 0:00:37.204 ****** 28011 1726882567.65289: entering _queue_task() for managed_node1/fail 28011 1726882567.65645: worker is 1 (out of 1 available) 28011 1726882567.65657: exiting _queue_task() for managed_node1/fail 28011 1726882567.65669: done queuing things up, now waiting for results queue to drain 28011 1726882567.65670: waiting for pending results... 
28011 1726882567.65956: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28011 1726882567.66082: in run() - task 12673a56-9f93-962d-7c65-0000000000d9 28011 1726882567.66109: variable 'ansible_search_path' from source: unknown 28011 1726882567.66122: variable 'ansible_search_path' from source: unknown 28011 1726882567.66163: calling self._execute() 28011 1726882567.66274: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882567.66289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882567.66310: variable 'omit' from source: magic vars 28011 1726882567.66699: variable 'ansible_distribution_major_version' from source: facts 28011 1726882567.66718: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882567.66847: variable 'network_state' from source: role '' defaults 28011 1726882567.66862: Evaluated conditional (network_state != {}): False 28011 1726882567.66871: when evaluation is False, skipping this task 28011 1726882567.66883: _execute() done 28011 1726882567.66895: dumping result to json 28011 1726882567.66904: done dumping result, returning 28011 1726882567.66916: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-962d-7c65-0000000000d9] 28011 1726882567.66927: sending task result for task 12673a56-9f93-962d-7c65-0000000000d9 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882567.67072: no more pending results, returning what we have 28011 1726882567.67076: results queue empty 28011 1726882567.67077: checking for any_errors_fatal 28011 1726882567.67085: done 
checking for any_errors_fatal 28011 1726882567.67085: checking for max_fail_percentage 28011 1726882567.67090: done checking for max_fail_percentage 28011 1726882567.67091: checking to see if all hosts have failed and the running result is not ok 28011 1726882567.67092: done checking to see if all hosts have failed 28011 1726882567.67094: getting the remaining hosts for this loop 28011 1726882567.67095: done getting the remaining hosts for this loop 28011 1726882567.67099: getting the next task for host managed_node1 28011 1726882567.67104: done getting next task for host managed_node1 28011 1726882567.67108: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28011 1726882567.67110: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882567.67124: getting variables 28011 1726882567.67126: in VariableManager get_vars() 28011 1726882567.67161: Calling all_inventory to load vars for managed_node1 28011 1726882567.67164: Calling groups_inventory to load vars for managed_node1 28011 1726882567.67166: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882567.67178: Calling all_plugins_play to load vars for managed_node1 28011 1726882567.67181: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882567.67183: Calling groups_plugins_play to load vars for managed_node1 28011 1726882567.67901: done sending task result for task 12673a56-9f93-962d-7c65-0000000000d9 28011 1726882567.67904: WORKER PROCESS EXITING 28011 1726882567.68810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882567.72150: done with get_vars() 28011 1726882567.72179: done getting variables 28011 1726882567.72342: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:36:07 -0400 (0:00:00.070) 0:00:37.275 ****** 28011 1726882567.72375: entering _queue_task() for managed_node1/fail 28011 1726882567.73171: worker is 1 (out of 1 available) 28011 1726882567.73183: exiting _queue_task() for managed_node1/fail 28011 1726882567.73310: done queuing things up, now waiting for results queue to drain 28011 1726882567.73313: waiting for pending results... 
28011 1726882567.73674: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28011 1726882567.73901: in run() - task 12673a56-9f93-962d-7c65-0000000000da 28011 1726882567.73923: variable 'ansible_search_path' from source: unknown 28011 1726882567.74298: variable 'ansible_search_path' from source: unknown 28011 1726882567.74303: calling self._execute() 28011 1726882567.74306: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882567.74308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882567.74311: variable 'omit' from source: magic vars 28011 1726882567.74988: variable 'ansible_distribution_major_version' from source: facts 28011 1726882567.75047: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882567.75169: variable 'network_state' from source: role '' defaults 28011 1726882567.75185: Evaluated conditional (network_state != {}): False 28011 1726882567.75202: when evaluation is False, skipping this task 28011 1726882567.75210: _execute() done 28011 1726882567.75220: dumping result to json 28011 1726882567.75228: done dumping result, returning 28011 1726882567.75239: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-962d-7c65-0000000000da] 28011 1726882567.75250: sending task result for task 12673a56-9f93-962d-7c65-0000000000da 28011 1726882567.75360: done sending task result for task 12673a56-9f93-962d-7c65-0000000000da 28011 1726882567.75368: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882567.75424: no more pending results, returning what we have 28011 
1726882567.75427: results queue empty 28011 1726882567.75428: checking for any_errors_fatal 28011 1726882567.75434: done checking for any_errors_fatal 28011 1726882567.75435: checking for max_fail_percentage 28011 1726882567.75436: done checking for max_fail_percentage 28011 1726882567.75437: checking to see if all hosts have failed and the running result is not ok 28011 1726882567.75438: done checking to see if all hosts have failed 28011 1726882567.75439: getting the remaining hosts for this loop 28011 1726882567.75440: done getting the remaining hosts for this loop 28011 1726882567.75443: getting the next task for host managed_node1 28011 1726882567.75448: done getting next task for host managed_node1 28011 1726882567.75452: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28011 1726882567.75454: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882567.75469: getting variables 28011 1726882567.75470: in VariableManager get_vars() 28011 1726882567.75716: Calling all_inventory to load vars for managed_node1 28011 1726882567.75719: Calling groups_inventory to load vars for managed_node1 28011 1726882567.75721: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882567.75730: Calling all_plugins_play to load vars for managed_node1 28011 1726882567.75733: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882567.75736: Calling groups_plugins_play to load vars for managed_node1 28011 1726882567.77087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882567.78691: done with get_vars() 28011 1726882567.78719: done getting variables 28011 1726882567.78782: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:36:07 -0400 (0:00:00.064) 0:00:37.339 ****** 28011 1726882567.78815: entering _queue_task() for managed_node1/fail 28011 1726882567.79286: worker is 1 (out of 1 available) 28011 1726882567.79298: exiting _queue_task() for managed_node1/fail 28011 1726882567.79309: done queuing things up, now waiting for results queue to drain 28011 1726882567.79311: waiting for pending results... 
28011 1726882567.79501: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28011 1726882567.79625: in run() - task 12673a56-9f93-962d-7c65-0000000000db 28011 1726882567.79649: variable 'ansible_search_path' from source: unknown 28011 1726882567.79657: variable 'ansible_search_path' from source: unknown 28011 1726882567.79700: calling self._execute() 28011 1726882567.79810: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882567.79823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882567.79845: variable 'omit' from source: magic vars 28011 1726882567.80412: variable 'ansible_distribution_major_version' from source: facts 28011 1726882567.80487: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882567.80732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882567.84150: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882567.84282: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882567.84398: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882567.84486: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882567.84584: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882567.84801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882567.84806: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882567.84834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882567.85018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882567.85021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882567.85240: variable 'ansible_distribution_major_version' from source: facts 28011 1726882567.85263: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28011 1726882567.85599: variable 'ansible_distribution' from source: facts 28011 1726882567.85671: variable '__network_rh_distros' from source: role '' defaults 28011 1726882567.85674: Evaluated conditional (ansible_distribution in __network_rh_distros): True 28011 1726882567.86131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882567.86253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882567.86288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 
1726882567.86375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882567.86398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882567.86498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882567.86579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882567.86616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882567.86916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882567.86919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882567.86922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882567.86924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 28011 1726882567.87004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882567.87053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882567.87116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882567.88052: variable 'network_connections' from source: play vars 28011 1726882567.88069: variable 'profile' from source: play vars 28011 1726882567.88150: variable 'profile' from source: play vars 28011 1726882567.88160: variable 'interface' from source: set_fact 28011 1726882567.88230: variable 'interface' from source: set_fact 28011 1726882567.88245: variable 'network_state' from source: role '' defaults 28011 1726882567.88324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882567.88542: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882567.88583: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882567.88624: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882567.88663: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882567.88748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882567.88845: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882567.88851: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882567.88895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882567.88926: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 28011 1726882567.88970: when evaluation is False, skipping this task 28011 1726882567.88973: _execute() done 28011 1726882567.88975: dumping result to json 28011 1726882567.88977: done dumping result, returning 28011 1726882567.88980: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-962d-7c65-0000000000db] 28011 1726882567.88983: sending task result for task 12673a56-9f93-962d-7c65-0000000000db skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 28011 1726882567.89129: no more pending results, returning what we have 28011 1726882567.89133: results queue empty 28011 1726882567.89134: checking for 
any_errors_fatal 28011 1726882567.89141: done checking for any_errors_fatal 28011 1726882567.89142: checking for max_fail_percentage 28011 1726882567.89144: done checking for max_fail_percentage 28011 1726882567.89145: checking to see if all hosts have failed and the running result is not ok 28011 1726882567.89146: done checking to see if all hosts have failed 28011 1726882567.89147: getting the remaining hosts for this loop 28011 1726882567.89148: done getting the remaining hosts for this loop 28011 1726882567.89152: getting the next task for host managed_node1 28011 1726882567.89158: done getting next task for host managed_node1 28011 1726882567.89162: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28011 1726882567.89163: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882567.89176: getting variables 28011 1726882567.89178: in VariableManager get_vars() 28011 1726882567.89223: Calling all_inventory to load vars for managed_node1 28011 1726882567.89226: Calling groups_inventory to load vars for managed_node1 28011 1726882567.89228: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882567.89240: Calling all_plugins_play to load vars for managed_node1 28011 1726882567.89243: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882567.89246: Calling groups_plugins_play to load vars for managed_node1 28011 1726882567.90020: done sending task result for task 12673a56-9f93-962d-7c65-0000000000db 28011 1726882567.90023: WORKER PROCESS EXITING 28011 1726882567.91830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882567.95219: done with get_vars() 28011 1726882567.95253: done getting variables 28011 1726882567.95428: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:36:07 -0400 (0:00:00.166) 0:00:37.505 ****** 28011 1726882567.95459: entering _queue_task() for managed_node1/dnf 28011 1726882567.96251: worker is 1 (out of 1 available) 28011 1726882567.96265: exiting _queue_task() for managed_node1/dnf 28011 1726882567.96277: done queuing things up, now waiting for results queue to drain 28011 1726882567.96278: waiting for pending results... 
28011 1726882567.96875: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28011 1726882567.96975: in run() - task 12673a56-9f93-962d-7c65-0000000000dc 28011 1726882567.96999: variable 'ansible_search_path' from source: unknown 28011 1726882567.97398: variable 'ansible_search_path' from source: unknown 28011 1726882567.97403: calling self._execute() 28011 1726882567.97405: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882567.97409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882567.97411: variable 'omit' from source: magic vars 28011 1726882567.98125: variable 'ansible_distribution_major_version' from source: facts 28011 1726882567.98143: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882567.98542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882568.01984: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882568.02062: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882568.02110: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882568.02148: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882568.02179: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882568.02266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.02305: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.02334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.02376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.02402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.02526: variable 'ansible_distribution' from source: facts 28011 1726882568.02536: variable 'ansible_distribution_major_version' from source: facts 28011 1726882568.02554: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28011 1726882568.02672: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882568.02810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.02838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.02866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.02913: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.02998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.03001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.03004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.03027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.03069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.03096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.03139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.03167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 
1726882568.03200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.03242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.03260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.03423: variable 'network_connections' from source: play vars 28011 1726882568.03439: variable 'profile' from source: play vars 28011 1726882568.03515: variable 'profile' from source: play vars 28011 1726882568.03601: variable 'interface' from source: set_fact 28011 1726882568.03605: variable 'interface' from source: set_fact 28011 1726882568.03663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882568.03851: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882568.03895: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882568.03931: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882568.03962: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882568.04011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882568.04037: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882568.04074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.04108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882568.04156: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882568.04597: variable 'network_connections' from source: play vars 28011 1726882568.04600: variable 'profile' from source: play vars 28011 1726882568.04603: variable 'profile' from source: play vars 28011 1726882568.04604: variable 'interface' from source: set_fact 28011 1726882568.04606: variable 'interface' from source: set_fact 28011 1726882568.04608: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28011 1726882568.04611: when evaluation is False, skipping this task 28011 1726882568.04613: _execute() done 28011 1726882568.04615: dumping result to json 28011 1726882568.04617: done dumping result, returning 28011 1726882568.04619: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-962d-7c65-0000000000dc] 28011 1726882568.04621: sending task result for task 12673a56-9f93-962d-7c65-0000000000dc skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28011 1726882568.04735: no more pending results, returning what we have 28011 1726882568.04738: results queue 
empty 28011 1726882568.04739: checking for any_errors_fatal 28011 1726882568.04746: done checking for any_errors_fatal 28011 1726882568.04746: checking for max_fail_percentage 28011 1726882568.04748: done checking for max_fail_percentage 28011 1726882568.04749: checking to see if all hosts have failed and the running result is not ok 28011 1726882568.04749: done checking to see if all hosts have failed 28011 1726882568.04750: getting the remaining hosts for this loop 28011 1726882568.04752: done getting the remaining hosts for this loop 28011 1726882568.04755: getting the next task for host managed_node1 28011 1726882568.04760: done getting next task for host managed_node1 28011 1726882568.04764: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28011 1726882568.04766: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882568.04781: done sending task result for task 12673a56-9f93-962d-7c65-0000000000dc 28011 1726882568.04784: WORKER PROCESS EXITING 28011 1726882568.04794: getting variables 28011 1726882568.04796: in VariableManager get_vars() 28011 1726882568.04837: Calling all_inventory to load vars for managed_node1 28011 1726882568.04839: Calling groups_inventory to load vars for managed_node1 28011 1726882568.04842: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882568.04853: Calling all_plugins_play to load vars for managed_node1 28011 1726882568.04857: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882568.04859: Calling groups_plugins_play to load vars for managed_node1 28011 1726882568.06509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882568.08266: done with get_vars() 28011 1726882568.08527: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28011 1726882568.08603: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:36:08 -0400 (0:00:00.131) 0:00:37.637 ****** 28011 1726882568.08630: entering _queue_task() for managed_node1/yum 28011 1726882568.09291: worker is 1 (out of 1 available) 28011 1726882568.09305: exiting _queue_task() for managed_node1/yum 28011 1726882568.09315: done queuing things up, now 
waiting for results queue to drain 28011 1726882568.09317: waiting for pending results... 28011 1726882568.09850: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28011 1726882568.10011: in run() - task 12673a56-9f93-962d-7c65-0000000000dd 28011 1726882568.10401: variable 'ansible_search_path' from source: unknown 28011 1726882568.10405: variable 'ansible_search_path' from source: unknown 28011 1726882568.10409: calling self._execute() 28011 1726882568.10509: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882568.10513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882568.10516: variable 'omit' from source: magic vars 28011 1726882568.11156: variable 'ansible_distribution_major_version' from source: facts 28011 1726882568.11389: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882568.11575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882568.16800: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882568.16805: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882568.16808: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882568.16810: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882568.17015: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882568.17096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.17129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.17398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.17401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.17404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.17477: variable 'ansible_distribution_major_version' from source: facts 28011 1726882568.17798: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28011 1726882568.17801: when evaluation is False, skipping this task 28011 1726882568.17804: _execute() done 28011 1726882568.17807: dumping result to json 28011 1726882568.17809: done dumping result, returning 28011 1726882568.17812: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-962d-7c65-0000000000dd] 28011 1726882568.17814: sending task result for task 12673a56-9f93-962d-7c65-0000000000dd 28011 1726882568.17888: done sending task result for task 12673a56-9f93-962d-7c65-0000000000dd 28011 1726882568.17892: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28011 1726882568.17940: no more pending results, returning what we have 28011 1726882568.17944: results queue empty 28011 1726882568.17944: checking for any_errors_fatal 28011 1726882568.17950: done checking for any_errors_fatal 28011 1726882568.17951: checking for max_fail_percentage 28011 1726882568.17952: done checking for max_fail_percentage 28011 1726882568.17953: checking to see if all hosts have failed and the running result is not ok 28011 1726882568.17954: done checking to see if all hosts have failed 28011 1726882568.17955: getting the remaining hosts for this loop 28011 1726882568.17956: done getting the remaining hosts for this loop 28011 1726882568.17959: getting the next task for host managed_node1 28011 1726882568.17965: done getting next task for host managed_node1 28011 1726882568.17968: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28011 1726882568.17970: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882568.17982: getting variables 28011 1726882568.17984: in VariableManager get_vars() 28011 1726882568.18026: Calling all_inventory to load vars for managed_node1 28011 1726882568.18028: Calling groups_inventory to load vars for managed_node1 28011 1726882568.18030: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882568.18041: Calling all_plugins_play to load vars for managed_node1 28011 1726882568.18044: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882568.18046: Calling groups_plugins_play to load vars for managed_node1 28011 1726882568.21166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882568.24416: done with get_vars() 28011 1726882568.24449: done getting variables 28011 1726882568.24720: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:36:08 -0400 (0:00:00.161) 0:00:37.798 ****** 28011 1726882568.24753: entering _queue_task() for managed_node1/fail 28011 1726882568.25530: worker is 1 (out of 1 available) 28011 1726882568.25540: exiting _queue_task() for managed_node1/fail 28011 1726882568.25550: done queuing things up, now waiting for results queue to drain 28011 1726882568.25552: waiting for pending results... 
28011 1726882568.25956: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28011 1726882568.26294: in run() - task 12673a56-9f93-962d-7c65-0000000000de 28011 1726882568.26299: variable 'ansible_search_path' from source: unknown 28011 1726882568.26302: variable 'ansible_search_path' from source: unknown 28011 1726882568.26305: calling self._execute() 28011 1726882568.26509: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882568.26523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882568.26540: variable 'omit' from source: magic vars 28011 1726882568.27489: variable 'ansible_distribution_major_version' from source: facts 28011 1726882568.27495: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882568.27650: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882568.28014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882568.32631: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882568.32699: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882568.32939: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882568.32974: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882568.33004: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882568.33078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28011 1726882568.33215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.33236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.33271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.33283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.33327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.33347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.33398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.33622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.33625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.33628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.33630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.33647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.33932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.33946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.34132: variable 'network_connections' from source: play vars 28011 1726882568.34145: variable 'profile' from source: play vars 28011 1726882568.34428: variable 'profile' from source: play vars 28011 1726882568.34432: variable 'interface' from source: set_fact 28011 1726882568.34498: variable 'interface' from source: set_fact 28011 1726882568.34567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882568.35298: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882568.35301: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882568.35304: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882568.35306: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882568.35308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882568.35310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882568.35312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.35334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882568.35384: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882568.35641: variable 'network_connections' from source: play vars 28011 1726882568.35655: variable 'profile' from source: play vars 28011 1726882568.35771: variable 'profile' from source: play vars 28011 1726882568.35774: variable 'interface' from source: set_fact 28011 1726882568.35776: variable 'interface' from source: set_fact 28011 1726882568.35802: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28011 1726882568.35805: when evaluation is False, skipping this task 28011 1726882568.35808: _execute() done 28011 1726882568.35811: dumping result to json 28011 1726882568.35815: done dumping result, returning 28011 1726882568.35824: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-962d-7c65-0000000000de] 28011 1726882568.35834: sending task result for task 12673a56-9f93-962d-7c65-0000000000de skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28011 1726882568.36136: no more pending results, returning what we have 28011 1726882568.36139: results queue empty 28011 1726882568.36140: checking for any_errors_fatal 28011 1726882568.36144: done checking for any_errors_fatal 28011 1726882568.36145: checking for max_fail_percentage 28011 1726882568.36146: done checking for max_fail_percentage 28011 1726882568.36147: checking to see if all hosts have failed and the running result is not ok 28011 1726882568.36148: done checking to see if all hosts have failed 28011 1726882568.36149: getting the remaining hosts for this loop 28011 1726882568.36150: done getting the remaining hosts for this loop 28011 1726882568.36153: getting the next task for host managed_node1 28011 1726882568.36158: done getting next task for host managed_node1 28011 1726882568.36161: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28011 1726882568.36163: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882568.36176: getting variables 28011 1726882568.36177: in VariableManager get_vars() 28011 1726882568.36213: Calling all_inventory to load vars for managed_node1 28011 1726882568.36216: Calling groups_inventory to load vars for managed_node1 28011 1726882568.36218: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882568.36226: Calling all_plugins_play to load vars for managed_node1 28011 1726882568.36229: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882568.36231: Calling groups_plugins_play to load vars for managed_node1 28011 1726882568.36812: done sending task result for task 12673a56-9f93-962d-7c65-0000000000de 28011 1726882568.36817: WORKER PROCESS EXITING 28011 1726882568.39118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882568.42433: done with get_vars() 28011 1726882568.42458: done getting variables 28011 1726882568.42526: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:36:08 -0400 (0:00:00.178) 0:00:37.976 ****** 28011 1726882568.42558: entering _queue_task() for managed_node1/package 28011 1726882568.43008: worker is 1 (out of 1 available) 28011 1726882568.43020: exiting _queue_task() for managed_node1/package 28011 1726882568.43035: done queuing things up, now waiting for results queue to drain 28011 1726882568.43037: waiting for pending results... 
28011 1726882568.43259: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 28011 1726882568.43389: in run() - task 12673a56-9f93-962d-7c65-0000000000df 28011 1726882568.43416: variable 'ansible_search_path' from source: unknown 28011 1726882568.43429: variable 'ansible_search_path' from source: unknown 28011 1726882568.43479: calling self._execute() 28011 1726882568.43641: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882568.43645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882568.43647: variable 'omit' from source: magic vars 28011 1726882568.44014: variable 'ansible_distribution_major_version' from source: facts 28011 1726882568.44032: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882568.44243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882568.44532: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882568.44584: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882568.44630: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882568.44769: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882568.44798: variable 'network_packages' from source: role '' defaults 28011 1726882568.44924: variable '__network_provider_setup' from source: role '' defaults 28011 1726882568.44938: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882568.45020: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882568.45034: variable '__network_packages_default_nm' from source: role '' defaults 28011 1726882568.45147: variable 
'__network_packages_default_nm' from source: role '' defaults 28011 1726882568.45350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882568.47410: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882568.47495: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882568.47537: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882568.47574: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882568.47700: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882568.47708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.47741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.47771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.47828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.47848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 
1726882568.47900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.47938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.47966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.48016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.48198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.48240: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28011 1726882568.48361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.48392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.48430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.48471: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.48496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.48600: variable 'ansible_python' from source: facts 28011 1726882568.48633: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28011 1726882568.48735: variable '__network_wpa_supplicant_required' from source: role '' defaults 28011 1726882568.48830: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28011 1726882568.48975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.49009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.49039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.49091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.49114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.49161: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.49299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.49302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.49305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.49307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.49452: variable 'network_connections' from source: play vars 28011 1726882568.49464: variable 'profile' from source: play vars 28011 1726882568.49577: variable 'profile' from source: play vars 28011 1726882568.49595: variable 'interface' from source: set_fact 28011 1726882568.49672: variable 'interface' from source: set_fact 28011 1726882568.49755: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882568.49790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882568.49829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.49875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882568.49931: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882568.50245: variable 'network_connections' from source: play vars 28011 1726882568.50276: variable 'profile' from source: play vars 28011 1726882568.50383: variable 'profile' from source: play vars 28011 1726882568.50498: variable 'interface' from source: set_fact 28011 1726882568.50501: variable 'interface' from source: set_fact 28011 1726882568.50529: variable '__network_packages_default_wireless' from source: role '' defaults 28011 1726882568.50622: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882568.51004: variable 'network_connections' from source: play vars 28011 1726882568.51018: variable 'profile' from source: play vars 28011 1726882568.51097: variable 'profile' from source: play vars 28011 1726882568.51108: variable 'interface' from source: set_fact 28011 1726882568.51226: variable 'interface' from source: set_fact 28011 1726882568.51263: variable '__network_packages_default_team' from source: role '' defaults 28011 1726882568.51354: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882568.51806: variable 'network_connections' from source: play vars 28011 1726882568.51809: variable 'profile' from source: play vars 28011 1726882568.51899: variable 'profile' from source: play vars 28011 1726882568.51903: variable 'interface' from source: set_fact 28011 1726882568.51956: variable 'interface' from source: set_fact 28011 1726882568.52028: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 1726882568.52097: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 1726882568.52132: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882568.52180: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882568.52425: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28011 1726882568.52969: variable 'network_connections' from source: play vars 28011 1726882568.52999: variable 'profile' from source: play vars 28011 1726882568.53053: variable 'profile' from source: play vars 28011 1726882568.53117: variable 'interface' from source: set_fact 28011 1726882568.53138: variable 'interface' from source: set_fact 28011 1726882568.53151: variable 'ansible_distribution' from source: facts 28011 1726882568.53158: variable '__network_rh_distros' from source: role '' defaults 28011 1726882568.53167: variable 'ansible_distribution_major_version' from source: facts 28011 1726882568.53185: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28011 1726882568.53371: variable 'ansible_distribution' from source: facts 28011 1726882568.53380: variable '__network_rh_distros' from source: role '' defaults 28011 1726882568.53446: variable 'ansible_distribution_major_version' from source: facts 28011 1726882568.53450: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28011 1726882568.53584: variable 'ansible_distribution' from source: facts 28011 1726882568.53599: variable '__network_rh_distros' from source: role '' defaults 28011 1726882568.53610: variable 'ansible_distribution_major_version' from source: facts 28011 1726882568.53651: variable 'network_provider' from source: set_fact 28011 1726882568.53679: variable 'ansible_facts' from source: unknown 28011 1726882568.54458: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28011 
1726882568.54466: when evaluation is False, skipping this task 28011 1726882568.54472: _execute() done 28011 1726882568.54478: dumping result to json 28011 1726882568.54499: done dumping result, returning 28011 1726882568.54505: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-962d-7c65-0000000000df] 28011 1726882568.54513: sending task result for task 12673a56-9f93-962d-7c65-0000000000df 28011 1726882568.54671: done sending task result for task 12673a56-9f93-962d-7c65-0000000000df 28011 1726882568.54674: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28011 1726882568.54846: no more pending results, returning what we have 28011 1726882568.54851: results queue empty 28011 1726882568.54851: checking for any_errors_fatal 28011 1726882568.54862: done checking for any_errors_fatal 28011 1726882568.54862: checking for max_fail_percentage 28011 1726882568.54864: done checking for max_fail_percentage 28011 1726882568.54865: checking to see if all hosts have failed and the running result is not ok 28011 1726882568.54866: done checking to see if all hosts have failed 28011 1726882568.54867: getting the remaining hosts for this loop 28011 1726882568.54868: done getting the remaining hosts for this loop 28011 1726882568.54872: getting the next task for host managed_node1 28011 1726882568.54878: done getting next task for host managed_node1 28011 1726882568.54882: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28011 1726882568.54884: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 28011 1726882568.54908: getting variables 28011 1726882568.54911: in VariableManager get_vars() 28011 1726882568.54952: Calling all_inventory to load vars for managed_node1 28011 1726882568.54955: Calling groups_inventory to load vars for managed_node1 28011 1726882568.54958: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882568.54975: Calling all_plugins_play to load vars for managed_node1 28011 1726882568.54979: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882568.54982: Calling groups_plugins_play to load vars for managed_node1 28011 1726882568.56630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882568.58305: done with get_vars() 28011 1726882568.58331: done getting variables 28011 1726882568.58406: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:36:08 -0400 (0:00:00.158) 0:00:38.135 ****** 28011 1726882568.58438: entering _queue_task() for managed_node1/package 28011 1726882568.58917: worker is 1 (out of 1 available) 28011 1726882568.58928: exiting _queue_task() for managed_node1/package 28011 1726882568.58941: done queuing things up, now waiting for results queue to drain 28011 1726882568.58942: waiting for pending results... 
28011 1726882568.59292: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28011 1726882568.59300: in run() - task 12673a56-9f93-962d-7c65-0000000000e0 28011 1726882568.59303: variable 'ansible_search_path' from source: unknown 28011 1726882568.59305: variable 'ansible_search_path' from source: unknown 28011 1726882568.59340: calling self._execute() 28011 1726882568.59453: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882568.59465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882568.59481: variable 'omit' from source: magic vars 28011 1726882568.59890: variable 'ansible_distribution_major_version' from source: facts 28011 1726882568.59926: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882568.60049: variable 'network_state' from source: role '' defaults 28011 1726882568.60095: Evaluated conditional (network_state != {}): False 28011 1726882568.60102: when evaluation is False, skipping this task 28011 1726882568.60105: _execute() done 28011 1726882568.60108: dumping result to json 28011 1726882568.60110: done dumping result, returning 28011 1726882568.60112: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-962d-7c65-0000000000e0] 28011 1726882568.60119: sending task result for task 12673a56-9f93-962d-7c65-0000000000e0 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882568.60277: no more pending results, returning what we have 28011 1726882568.60282: results queue empty 28011 1726882568.60283: checking for any_errors_fatal 28011 1726882568.60295: done checking for any_errors_fatal 28011 1726882568.60296: checking for max_fail_percentage 28011 
1726882568.60298: done checking for max_fail_percentage 28011 1726882568.60299: checking to see if all hosts have failed and the running result is not ok 28011 1726882568.60300: done checking to see if all hosts have failed 28011 1726882568.60300: getting the remaining hosts for this loop 28011 1726882568.60302: done getting the remaining hosts for this loop 28011 1726882568.60305: getting the next task for host managed_node1 28011 1726882568.60311: done getting next task for host managed_node1 28011 1726882568.60316: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28011 1726882568.60318: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882568.60335: getting variables 28011 1726882568.60337: in VariableManager get_vars() 28011 1726882568.60376: Calling all_inventory to load vars for managed_node1 28011 1726882568.60379: Calling groups_inventory to load vars for managed_node1 28011 1726882568.60382: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882568.60604: Calling all_plugins_play to load vars for managed_node1 28011 1726882568.60608: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882568.60613: Calling groups_plugins_play to load vars for managed_node1 28011 1726882568.61207: done sending task result for task 12673a56-9f93-962d-7c65-0000000000e0 28011 1726882568.61210: WORKER PROCESS EXITING 28011 1726882568.62249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882568.68360: done with get_vars() 28011 1726882568.68383: done getting variables 28011 1726882568.68439: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:36:08 -0400 (0:00:00.100) 0:00:38.235 ****** 28011 1726882568.68463: entering _queue_task() for managed_node1/package 28011 1726882568.68904: worker is 1 (out of 1 available) 28011 1726882568.68917: exiting _queue_task() for managed_node1/package 28011 1726882568.68928: done queuing things up, now waiting for results queue to drain 28011 1726882568.68930: waiting for pending results... 28011 1726882568.69156: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28011 1726882568.69282: in run() - task 12673a56-9f93-962d-7c65-0000000000e1 28011 1726882568.69308: variable 'ansible_search_path' from source: unknown 28011 1726882568.69319: variable 'ansible_search_path' from source: unknown 28011 1726882568.69357: calling self._execute() 28011 1726882568.69458: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882568.69468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882568.69488: variable 'omit' from source: magic vars 28011 1726882568.70554: variable 'ansible_distribution_major_version' from source: facts 28011 1726882568.70558: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882568.70599: variable 'network_state' from source: role '' defaults 28011 1726882568.70614: Evaluated conditional (network_state != {}): False 28011 1726882568.70620: when evaluation is False, 
skipping this task 28011 1726882568.70627: _execute() done 28011 1726882568.70666: dumping result to json 28011 1726882568.70674: done dumping result, returning 28011 1726882568.70685: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-962d-7c65-0000000000e1] 28011 1726882568.70700: sending task result for task 12673a56-9f93-962d-7c65-0000000000e1 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28011 1726882568.70852: no more pending results, returning what we have 28011 1726882568.70856: results queue empty 28011 1726882568.70857: checking for any_errors_fatal 28011 1726882568.70865: done checking for any_errors_fatal 28011 1726882568.70866: checking for max_fail_percentage 28011 1726882568.70869: done checking for max_fail_percentage 28011 1726882568.70870: checking to see if all hosts have failed and the running result is not ok 28011 1726882568.70871: done checking to see if all hosts have failed 28011 1726882568.70873: getting the remaining hosts for this loop 28011 1726882568.70874: done getting the remaining hosts for this loop 28011 1726882568.70878: getting the next task for host managed_node1 28011 1726882568.70884: done getting next task for host managed_node1 28011 1726882568.70890: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28011 1726882568.70892: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882568.70909: getting variables 28011 1726882568.70911: in VariableManager get_vars() 28011 1726882568.70947: Calling all_inventory to load vars for managed_node1 28011 1726882568.70949: Calling groups_inventory to load vars for managed_node1 28011 1726882568.70951: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882568.70962: Calling all_plugins_play to load vars for managed_node1 28011 1726882568.70965: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882568.70969: Calling groups_plugins_play to load vars for managed_node1 28011 1726882568.71868: done sending task result for task 12673a56-9f93-962d-7c65-0000000000e1 28011 1726882568.71873: WORKER PROCESS EXITING 28011 1726882568.74081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882568.77910: done with get_vars() 28011 1726882568.77941: done getting variables 28011 1726882568.78151: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:36:08 -0400 (0:00:00.097) 0:00:38.333 ****** 28011 1726882568.78183: entering _queue_task() for managed_node1/service 28011 1726882568.79149: worker is 1 (out of 1 available) 28011 1726882568.79160: exiting _queue_task() for managed_node1/service 28011 1726882568.79173: done queuing things up, now waiting for results queue to drain 28011 1726882568.79174: waiting for pending results... 
28011 1726882568.79805: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28011 1726882568.80363: in run() - task 12673a56-9f93-962d-7c65-0000000000e2 28011 1726882568.80367: variable 'ansible_search_path' from source: unknown 28011 1726882568.80369: variable 'ansible_search_path' from source: unknown 28011 1726882568.80371: calling self._execute() 28011 1726882568.80643: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882568.81002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882568.81005: variable 'omit' from source: magic vars 28011 1726882568.81620: variable 'ansible_distribution_major_version' from source: facts 28011 1726882568.81715: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882568.81979: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882568.82405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882568.86834: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882568.87659: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882568.87704: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882568.88198: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882568.88202: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882568.88232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28011 1726882568.88265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.88526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.88563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.88576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.88623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.88644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.88667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.88705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.88718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.88756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882568.88777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882568.89002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.89040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882568.89054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882568.89424: variable 'network_connections' from source: play vars 28011 1726882568.89436: variable 'profile' from source: play vars 28011 1726882568.89510: variable 'profile' from source: play vars 28011 1726882568.89514: variable 'interface' from source: set_fact 28011 1726882568.89572: variable 'interface' from source: set_fact 28011 1726882568.89847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882568.90209: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882568.90241: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882568.90269: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882568.90309: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882568.90348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882568.90368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882568.90392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882568.90646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882568.90666: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882568.91099: variable 'network_connections' from source: play vars 28011 1726882568.91103: variable 'profile' from source: play vars 28011 1726882568.91191: variable 'profile' from source: play vars 28011 1726882568.91196: variable 'interface' from source: set_fact 28011 1726882568.91421: variable 'interface' from source: set_fact 28011 1726882568.91444: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28011 1726882568.91447: when evaluation is False, skipping this task 28011 1726882568.91450: _execute() done 28011 1726882568.91453: dumping result to json 28011 1726882568.91455: done dumping result, returning 28011 1726882568.91497: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12673a56-9f93-962d-7c65-0000000000e2] 28011 1726882568.91517: sending task result for task 12673a56-9f93-962d-7c65-0000000000e2 28011 1726882568.91579: done sending task result for task 12673a56-9f93-962d-7c65-0000000000e2 28011 1726882568.91581: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28011 1726882568.91660: no more pending results, returning what we have 28011 1726882568.91664: results queue empty 28011 1726882568.91664: checking for any_errors_fatal 28011 1726882568.91671: done checking for any_errors_fatal 28011 1726882568.91671: checking for max_fail_percentage 28011 1726882568.91673: done checking for max_fail_percentage 28011 1726882568.91674: checking to see if all hosts have failed and the running result is not ok 28011 1726882568.91675: done checking to see if all hosts have failed 28011 1726882568.91675: getting the remaining hosts for this loop 28011 1726882568.91677: done getting the remaining hosts for this loop 28011 1726882568.91681: getting the next task for host managed_node1 28011 1726882568.91689: done getting next task for host managed_node1 28011 1726882568.91695: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28011 1726882568.91697: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882568.91711: getting variables 28011 1726882568.91713: in VariableManager get_vars() 28011 1726882568.91753: Calling all_inventory to load vars for managed_node1 28011 1726882568.91756: Calling groups_inventory to load vars for managed_node1 28011 1726882568.91758: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882568.91769: Calling all_plugins_play to load vars for managed_node1 28011 1726882568.91772: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882568.91775: Calling groups_plugins_play to load vars for managed_node1 28011 1726882568.95429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882568.98530: done with get_vars() 28011 1726882568.98559: done getting variables 28011 1726882568.98735: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:36:08 -0400 (0:00:00.205) 0:00:38.538 ****** 28011 1726882568.98767: entering _queue_task() for managed_node1/service 28011 1726882568.99577: worker is 1 (out of 1 available) 28011 1726882568.99709: exiting _queue_task() for managed_node1/service 28011 1726882568.99723: done queuing things up, now waiting for results queue to drain 28011 1726882568.99725: waiting for pending results... 
28011 1726882569.00184: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28011 1726882569.00499: in run() - task 12673a56-9f93-962d-7c65-0000000000e3 28011 1726882569.00503: variable 'ansible_search_path' from source: unknown 28011 1726882569.00506: variable 'ansible_search_path' from source: unknown 28011 1726882569.00548: calling self._execute() 28011 1726882569.00653: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882569.00659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882569.00789: variable 'omit' from source: magic vars 28011 1726882569.01694: variable 'ansible_distribution_major_version' from source: facts 28011 1726882569.01699: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882569.02000: variable 'network_provider' from source: set_fact 28011 1726882569.02004: variable 'network_state' from source: role '' defaults 28011 1726882569.02006: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28011 1726882569.02009: variable 'omit' from source: magic vars 28011 1726882569.02040: variable 'omit' from source: magic vars 28011 1726882569.02071: variable 'network_service_name' from source: role '' defaults 28011 1726882569.02365: variable 'network_service_name' from source: role '' defaults 28011 1726882569.02475: variable '__network_provider_setup' from source: role '' defaults 28011 1726882569.02479: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882569.02544: variable '__network_service_name_default_nm' from source: role '' defaults 28011 1726882569.02548: variable '__network_packages_default_nm' from source: role '' defaults 28011 1726882569.02725: variable '__network_packages_default_nm' from source: role '' defaults 28011 1726882569.03248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 28011 1726882569.07302: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882569.07308: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882569.07311: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882569.07313: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882569.07315: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882569.07349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882569.07376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882569.07401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882569.07599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882569.07602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882569.07605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28011 1726882569.07607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882569.07609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882569.07612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882569.07615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882569.07808: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28011 1726882569.07919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882569.07943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882569.07968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882569.08004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882569.08019: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882569.08101: variable 'ansible_python' from source: facts 28011 1726882569.08122: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28011 1726882569.08391: variable '__network_wpa_supplicant_required' from source: role '' defaults 28011 1726882569.08396: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28011 1726882569.08399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882569.08412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882569.08437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882569.08473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882569.08489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882569.08532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882569.08554: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882569.08699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882569.08702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882569.08705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882569.08753: variable 'network_connections' from source: play vars 28011 1726882569.08760: variable 'profile' from source: play vars 28011 1726882569.08898: variable 'profile' from source: play vars 28011 1726882569.08902: variable 'interface' from source: set_fact 28011 1726882569.08904: variable 'interface' from source: set_fact 28011 1726882569.08992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28011 1726882569.09167: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882569.09217: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882569.09260: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882569.09299: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882569.09556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882569.09589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882569.09617: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882569.09648: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882569.09696: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882569.10369: variable 'network_connections' from source: play vars 28011 1726882569.10375: variable 'profile' from source: play vars 28011 1726882569.10449: variable 'profile' from source: play vars 28011 1726882569.10456: variable 'interface' from source: set_fact 28011 1726882569.10553: variable 'interface' from source: set_fact 28011 1726882569.10616: variable '__network_packages_default_wireless' from source: role '' defaults 28011 1726882569.10692: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882569.10991: variable 'network_connections' from source: play vars 28011 1726882569.11000: variable 'profile' from source: play vars 28011 1726882569.11063: variable 'profile' from source: play vars 28011 1726882569.11066: variable 'interface' from source: set_fact 28011 1726882569.11138: variable 'interface' from source: set_fact 28011 1726882569.11162: variable '__network_packages_default_team' from source: role '' defaults 28011 1726882569.11328: variable '__network_team_connections_defined' from source: role '' defaults 28011 1726882569.11515: variable 
'network_connections' from source: play vars 28011 1726882569.11518: variable 'profile' from source: play vars 28011 1726882569.11578: variable 'profile' from source: play vars 28011 1726882569.11581: variable 'interface' from source: set_fact 28011 1726882569.11644: variable 'interface' from source: set_fact 28011 1726882569.11692: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 1726882569.11744: variable '__network_service_name_default_initscripts' from source: role '' defaults 28011 1726882569.11750: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882569.11805: variable '__network_packages_default_initscripts' from source: role '' defaults 28011 1726882569.12052: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28011 1726882569.12535: variable 'network_connections' from source: play vars 28011 1726882569.12540: variable 'profile' from source: play vars 28011 1726882569.12667: variable 'profile' from source: play vars 28011 1726882569.12673: variable 'interface' from source: set_fact 28011 1726882569.12847: variable 'interface' from source: set_fact 28011 1726882569.12852: variable 'ansible_distribution' from source: facts 28011 1726882569.12855: variable '__network_rh_distros' from source: role '' defaults 28011 1726882569.12857: variable 'ansible_distribution_major_version' from source: facts 28011 1726882569.12859: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28011 1726882569.13298: variable 'ansible_distribution' from source: facts 28011 1726882569.13302: variable '__network_rh_distros' from source: role '' defaults 28011 1726882569.13304: variable 'ansible_distribution_major_version' from source: facts 28011 1726882569.13306: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28011 1726882569.13483: variable 'ansible_distribution' from source: 
facts 28011 1726882569.13489: variable '__network_rh_distros' from source: role '' defaults 28011 1726882569.13492: variable 'ansible_distribution_major_version' from source: facts 28011 1726882569.13534: variable 'network_provider' from source: set_fact 28011 1726882569.13556: variable 'omit' from source: magic vars 28011 1726882569.13583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882569.13774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882569.13800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882569.13813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882569.13823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882569.13864: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882569.13867: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882569.13869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882569.13952: Set connection var ansible_connection to ssh 28011 1726882569.13960: Set connection var ansible_pipelining to False 28011 1726882569.13966: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882569.13971: Set connection var ansible_shell_executable to /bin/sh 28011 1726882569.13980: Set connection var ansible_timeout to 10 28011 1726882569.13989: Set connection var ansible_shell_type to sh 28011 1726882569.14122: variable 'ansible_shell_executable' from source: unknown 28011 1726882569.14126: variable 'ansible_connection' from source: unknown 28011 1726882569.14128: variable 'ansible_module_compression' from source: unknown 28011 1726882569.14130: 
variable 'ansible_shell_type' from source: unknown 28011 1726882569.14133: variable 'ansible_shell_executable' from source: unknown 28011 1726882569.14135: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882569.14142: variable 'ansible_pipelining' from source: unknown 28011 1726882569.14144: variable 'ansible_timeout' from source: unknown 28011 1726882569.14146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882569.14247: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882569.14256: variable 'omit' from source: magic vars 28011 1726882569.14261: starting attempt loop 28011 1726882569.14263: running the handler 28011 1726882569.14414: variable 'ansible_facts' from source: unknown 28011 1726882569.15154: _low_level_execute_command(): starting 28011 1726882569.15160: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882569.15901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882569.15920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882569.15946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882569.15988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882569.16005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882569.16053: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882569.16112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882569.16131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882569.16181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882569.16228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882569.17922: stdout chunk (state=3): >>>/root <<< 28011 1726882569.18064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882569.18138: stderr chunk (state=3): >>><<< 28011 1726882569.18143: stdout chunk (state=3): >>><<< 28011 1726882569.18254: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882569.18261: _low_level_execute_command(): starting 28011 1726882569.18264: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765 `" && echo ansible-tmp-1726882569.1816225-29742-61926933389765="` echo /root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765 `" ) && sleep 0' 28011 1726882569.19433: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882569.19720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
28011 1726882569.19791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882569.21642: stdout chunk (state=3): >>>ansible-tmp-1726882569.1816225-29742-61926933389765=/root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765 <<< 28011 1726882569.21752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882569.21797: stderr chunk (state=3): >>><<< 28011 1726882569.21909: stdout chunk (state=3): >>><<< 28011 1726882569.21931: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882569.1816225-29742-61926933389765=/root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882569.21963: variable 'ansible_module_compression' from source: unknown 28011 1726882569.22041: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28011 1726882569.22072: variable 'ansible_facts' from source: unknown 28011 1726882569.22473: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765/AnsiballZ_systemd.py 28011 1726882569.22853: Sending initial data 28011 1726882569.22856: Sent initial data (155 bytes) 28011 1726882569.24114: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882569.24117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882569.24136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882569.24142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882569.24175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882569.24178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882569.24330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882569.24333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882569.24506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 
1726882569.24512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882569.26216: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28011 1726882569.26220: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 28011 1726882569.26222: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 28011 1726882569.26224: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 28011 1726882569.26226: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 28011 1726882569.26232: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 28011 1726882569.26234: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 28011 1726882569.26236: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 28011 1726882569.26237: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28011 1726882569.26239: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 <<< 28011 1726882569.26241: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" <<< 28011 1726882569.26243: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882569.26405: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882569.26419: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp9kd5a6bf /root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765/AnsiballZ_systemd.py <<< 28011 1726882569.26613: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765/AnsiballZ_systemd.py" <<< 28011 1726882569.26714: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp9kd5a6bf" to remote "/root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765/AnsiballZ_systemd.py" <<< 28011 1726882569.29937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882569.30010: stderr chunk (state=3): >>><<< 28011 1726882569.30131: stdout chunk (state=3): >>><<< 28011 1726882569.30134: done transferring module to remote 28011 1726882569.30137: _low_level_execute_command(): starting 28011 1726882569.30139: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765/ /root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765/AnsiballZ_systemd.py && sleep 0' 28011 1726882569.30885: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882569.30912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882569.30960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882569.30992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882569.31031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882569.32885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882569.32891: stdout chunk (state=3): >>><<< 28011 1726882569.33182: stderr chunk (state=3): >>><<< 28011 1726882569.33190: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882569.33195: _low_level_execute_command(): starting 28011 1726882569.33198: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765/AnsiballZ_systemd.py && sleep 0' 28011 1726882569.33879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882569.33898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882569.33915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882569.33971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882569.33988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882569.34042: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 28011 1726882569.62803: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10813440", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3302395904", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1665345000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", 
"ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", 
"PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": 
"network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28011 1726882569.64411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882569.64428: stderr chunk (state=3): >>><<< 28011 1726882569.64431: stdout chunk (state=3): >>><<< 28011 1726882569.64447: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": 
"18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10813440", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3302395904", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1665345000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": 
"[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", 
"CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882569.64599: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882569.64629: _low_level_execute_command(): starting 28011 1726882569.64632: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882569.1816225-29742-61926933389765/ > /dev/null 2>&1 && sleep 0' 28011 1726882569.65515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882569.65603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882569.65629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882569.65702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882569.67462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882569.67497: stderr chunk (state=3): >>><<< 28011 1726882569.67507: stdout chunk (state=3): >>><<< 28011 1726882569.67522: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882569.67533: handler run complete 28011 1726882569.67571: attempt loop complete, returning result 28011 1726882569.67574: _execute() done 28011 1726882569.67576: dumping result to json 28011 1726882569.67590: done dumping result, returning 28011 1726882569.67599: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-962d-7c65-0000000000e3] 28011 1726882569.67602: sending task result for task 12673a56-9f93-962d-7c65-0000000000e3 28011 1726882569.67832: done sending task result for task 12673a56-9f93-962d-7c65-0000000000e3 28011 1726882569.67835: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882569.67881: no more pending results, returning what we have 28011 1726882569.67884: results queue empty 28011 1726882569.67884: checking for any_errors_fatal 28011 1726882569.67896: done checking for any_errors_fatal 28011 1726882569.67897: checking for max_fail_percentage 28011 1726882569.67899: done checking for max_fail_percentage 28011 1726882569.67900: checking to see if all hosts have failed and the running result is not ok 28011 1726882569.67900: done checking to see if all hosts have failed 28011 1726882569.67901: getting the remaining hosts for this loop 28011 1726882569.67902: done getting the remaining hosts for this loop 28011 1726882569.67906: getting the next task for host managed_node1 28011 1726882569.67910: done getting next task for host managed_node1 28011 1726882569.67914: ^ task is: TASK: fedora.linux_system_roles.network : Enable 
and start wpa_supplicant 28011 1726882569.67915: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882569.67924: getting variables 28011 1726882569.67926: in VariableManager get_vars() 28011 1726882569.67963: Calling all_inventory to load vars for managed_node1 28011 1726882569.67965: Calling groups_inventory to load vars for managed_node1 28011 1726882569.67967: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882569.67976: Calling all_plugins_play to load vars for managed_node1 28011 1726882569.67979: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882569.67981: Calling groups_plugins_play to load vars for managed_node1 28011 1726882569.69546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882569.71405: done with get_vars() 28011 1726882569.71428: done getting variables 28011 1726882569.71499: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:09 -0400 (0:00:00.727) 0:00:39.266 ****** 28011 1726882569.71529: entering _queue_task() for managed_node1/service 28011 1726882569.71882: worker is 1 (out of 1 available) 28011 1726882569.72007: exiting _queue_task() for managed_node1/service 28011 1726882569.72019: done queuing things 
up, now waiting for results queue to drain 28011 1726882569.72020: waiting for pending results... 28011 1726882569.72315: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28011 1726882569.72321: in run() - task 12673a56-9f93-962d-7c65-0000000000e4 28011 1726882569.72412: variable 'ansible_search_path' from source: unknown 28011 1726882569.72417: variable 'ansible_search_path' from source: unknown 28011 1726882569.72420: calling self._execute() 28011 1726882569.72496: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882569.72509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882569.72532: variable 'omit' from source: magic vars 28011 1726882569.72930: variable 'ansible_distribution_major_version' from source: facts 28011 1726882569.72953: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882569.73078: variable 'network_provider' from source: set_fact 28011 1726882569.73089: Evaluated conditional (network_provider == "nm"): True 28011 1726882569.73188: variable '__network_wpa_supplicant_required' from source: role '' defaults 28011 1726882569.73287: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28011 1726882569.73502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882569.75590: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882569.75657: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882569.75705: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882569.75744: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882569.75788: 
Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882569.75876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882569.76002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882569.76005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882569.76010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882569.76012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882569.76054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882569.76082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882569.76116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882569.76164: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882569.76216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882569.76227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882569.76255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882569.76280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882569.76327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882569.76350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882569.76543: variable 'network_connections' from source: play vars 28011 1726882569.76546: variable 'profile' from source: play vars 28011 1726882569.76612: variable 'profile' from source: play vars 28011 1726882569.76623: variable 'interface' from source: set_fact 28011 1726882569.76697: variable 'interface' from source: set_fact 28011 1726882569.76779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 
28011 1726882569.76978: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28011 1726882569.77008: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28011 1726882569.77041: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28011 1726882569.77075: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28011 1726882569.77208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28011 1726882569.77211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28011 1726882569.77214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882569.77216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28011 1726882569.77261: variable '__network_wireless_connections_defined' from source: role '' defaults 28011 1726882569.77550: variable 'network_connections' from source: play vars 28011 1726882569.77560: variable 'profile' from source: play vars 28011 1726882569.77623: variable 'profile' from source: play vars 28011 1726882569.77632: variable 'interface' from source: set_fact 28011 1726882569.77705: variable 'interface' from source: set_fact 28011 1726882569.77740: Evaluated conditional (__network_wpa_supplicant_required): False 28011 1726882569.77753: when evaluation is False, skipping this task 28011 1726882569.77857: 
_execute() done 28011 1726882569.77871: dumping result to json 28011 1726882569.77873: done dumping result, returning 28011 1726882569.77876: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-962d-7c65-0000000000e4] 28011 1726882569.77878: sending task result for task 12673a56-9f93-962d-7c65-0000000000e4 28011 1726882569.77951: done sending task result for task 12673a56-9f93-962d-7c65-0000000000e4 28011 1726882569.77955: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28011 1726882569.78001: no more pending results, returning what we have 28011 1726882569.78005: results queue empty 28011 1726882569.78005: checking for any_errors_fatal 28011 1726882569.78025: done checking for any_errors_fatal 28011 1726882569.78026: checking for max_fail_percentage 28011 1726882569.78028: done checking for max_fail_percentage 28011 1726882569.78029: checking to see if all hosts have failed and the running result is not ok 28011 1726882569.78029: done checking to see if all hosts have failed 28011 1726882569.78030: getting the remaining hosts for this loop 28011 1726882569.78031: done getting the remaining hosts for this loop 28011 1726882569.78035: getting the next task for host managed_node1 28011 1726882569.78042: done getting next task for host managed_node1 28011 1726882569.78046: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28011 1726882569.78049: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882569.78063: getting variables 28011 1726882569.78065: in VariableManager get_vars() 28011 1726882569.78106: Calling all_inventory to load vars for managed_node1 28011 1726882569.78109: Calling groups_inventory to load vars for managed_node1 28011 1726882569.78112: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882569.78123: Calling all_plugins_play to load vars for managed_node1 28011 1726882569.78126: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882569.78129: Calling groups_plugins_play to load vars for managed_node1 28011 1726882569.79855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882569.81511: done with get_vars() 28011 1726882569.81538: done getting variables 28011 1726882569.81610: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:09 -0400 (0:00:00.101) 0:00:39.367 ****** 28011 1726882569.81642: entering _queue_task() for managed_node1/service 28011 1726882569.82108: worker is 1 (out of 1 available) 28011 1726882569.82121: exiting _queue_task() for managed_node1/service 28011 1726882569.82132: done queuing things up, now waiting for results queue to drain 28011 1726882569.82133: waiting for pending results... 
28011 1726882569.82522: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 28011 1726882569.82526: in run() - task 12673a56-9f93-962d-7c65-0000000000e5 28011 1726882569.82529: variable 'ansible_search_path' from source: unknown 28011 1726882569.82532: variable 'ansible_search_path' from source: unknown 28011 1726882569.82535: calling self._execute() 28011 1726882569.82605: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882569.82624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882569.82639: variable 'omit' from source: magic vars 28011 1726882569.83031: variable 'ansible_distribution_major_version' from source: facts 28011 1726882569.83046: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882569.83202: variable 'network_provider' from source: set_fact 28011 1726882569.83206: Evaluated conditional (network_provider == "initscripts"): False 28011 1726882569.83208: when evaluation is False, skipping this task 28011 1726882569.83210: _execute() done 28011 1726882569.83213: dumping result to json 28011 1726882569.83215: done dumping result, returning 28011 1726882569.83217: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-962d-7c65-0000000000e5] 28011 1726882569.83227: sending task result for task 12673a56-9f93-962d-7c65-0000000000e5 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28011 1726882569.83364: no more pending results, returning what we have 28011 1726882569.83368: results queue empty 28011 1726882569.83369: checking for any_errors_fatal 28011 1726882569.83379: done checking for any_errors_fatal 28011 1726882569.83380: checking for max_fail_percentage 28011 1726882569.83382: done checking for max_fail_percentage 28011 
1726882569.83383: checking to see if all hosts have failed and the running result is not ok 28011 1726882569.83384: done checking to see if all hosts have failed 28011 1726882569.83384: getting the remaining hosts for this loop 28011 1726882569.83385: done getting the remaining hosts for this loop 28011 1726882569.83389: getting the next task for host managed_node1 28011 1726882569.83398: done getting next task for host managed_node1 28011 1726882569.83401: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28011 1726882569.83404: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882569.83419: getting variables 28011 1726882569.83420: in VariableManager get_vars() 28011 1726882569.83456: Calling all_inventory to load vars for managed_node1 28011 1726882569.83458: Calling groups_inventory to load vars for managed_node1 28011 1726882569.83460: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882569.83473: Calling all_plugins_play to load vars for managed_node1 28011 1726882569.83476: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882569.83479: Calling groups_plugins_play to load vars for managed_node1 28011 1726882569.83999: done sending task result for task 12673a56-9f93-962d-7c65-0000000000e5 28011 1726882569.84002: WORKER PROCESS EXITING 28011 1726882569.84750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882569.87227: done with get_vars() 28011 1726882569.87473: done getting variables 28011 1726882569.87544: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:09 -0400 (0:00:00.059) 0:00:39.427 ****** 28011 1726882569.87576: entering _queue_task() for managed_node1/copy 28011 1726882569.88062: worker is 1 (out of 1 available) 28011 1726882569.88073: exiting _queue_task() for managed_node1/copy 28011 1726882569.88082: done queuing things up, now waiting for results queue to drain 28011 1726882569.88084: waiting for pending results... 28011 1726882569.88330: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28011 1726882569.88448: in run() - task 12673a56-9f93-962d-7c65-0000000000e6 28011 1726882569.88471: variable 'ansible_search_path' from source: unknown 28011 1726882569.88533: variable 'ansible_search_path' from source: unknown 28011 1726882569.88537: calling self._execute() 28011 1726882569.88658: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882569.88669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882569.88714: variable 'omit' from source: magic vars 28011 1726882569.89247: variable 'ansible_distribution_major_version' from source: facts 28011 1726882569.89299: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882569.89406: variable 'network_provider' from source: set_fact 28011 1726882569.89424: Evaluated conditional (network_provider == "initscripts"): False 28011 1726882569.89432: when evaluation is False, skipping this task 28011 1726882569.89461: _execute() done 28011 1726882569.89464: dumping result to json 
28011 1726882569.89466: done dumping result, returning 28011 1726882569.89470: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-962d-7c65-0000000000e6] 28011 1726882569.89473: sending task result for task 12673a56-9f93-962d-7c65-0000000000e6 28011 1726882569.89553: done sending task result for task 12673a56-9f93-962d-7c65-0000000000e6 28011 1726882569.89556: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28011 1726882569.89609: no more pending results, returning what we have 28011 1726882569.89616: results queue empty 28011 1726882569.89617: checking for any_errors_fatal 28011 1726882569.89623: done checking for any_errors_fatal 28011 1726882569.89624: checking for max_fail_percentage 28011 1726882569.89625: done checking for max_fail_percentage 28011 1726882569.89626: checking to see if all hosts have failed and the running result is not ok 28011 1726882569.89627: done checking to see if all hosts have failed 28011 1726882569.89628: getting the remaining hosts for this loop 28011 1726882569.89629: done getting the remaining hosts for this loop 28011 1726882569.89632: getting the next task for host managed_node1 28011 1726882569.89638: done getting next task for host managed_node1 28011 1726882569.89641: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28011 1726882569.89644: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882569.89658: getting variables 28011 1726882569.89659: in VariableManager get_vars() 28011 1726882569.89701: Calling all_inventory to load vars for managed_node1 28011 1726882569.89704: Calling groups_inventory to load vars for managed_node1 28011 1726882569.89706: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882569.89715: Calling all_plugins_play to load vars for managed_node1 28011 1726882569.89717: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882569.89724: Calling groups_plugins_play to load vars for managed_node1 28011 1726882569.90517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882569.91830: done with get_vars() 28011 1726882569.91846: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:09 -0400 (0:00:00.043) 0:00:39.470 ****** 28011 1726882569.91908: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 28011 1726882569.92137: worker is 1 (out of 1 available) 28011 1726882569.92151: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 28011 1726882569.92164: done queuing things up, now waiting for results queue to drain 28011 1726882569.92165: waiting for pending results... 
28011 1726882569.92332: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28011 1726882569.92408: in run() - task 12673a56-9f93-962d-7c65-0000000000e7 28011 1726882569.92421: variable 'ansible_search_path' from source: unknown 28011 1726882569.92425: variable 'ansible_search_path' from source: unknown 28011 1726882569.92452: calling self._execute() 28011 1726882569.92529: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882569.92533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882569.92542: variable 'omit' from source: magic vars 28011 1726882569.92807: variable 'ansible_distribution_major_version' from source: facts 28011 1726882569.92817: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882569.92822: variable 'omit' from source: magic vars 28011 1726882569.92852: variable 'omit' from source: magic vars 28011 1726882569.92961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28011 1726882569.94598: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28011 1726882569.94641: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28011 1726882569.94666: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28011 1726882569.94697: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28011 1726882569.94717: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28011 1726882569.94771: variable 'network_provider' from source: set_fact 28011 1726882569.94863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28011 1726882569.94882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28011 1726882569.94906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28011 1726882569.94933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28011 1726882569.94943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28011 1726882569.94996: variable 'omit' from source: magic vars 28011 1726882569.95070: variable 'omit' from source: magic vars 28011 1726882569.95142: variable 'network_connections' from source: play vars 28011 1726882569.95151: variable 'profile' from source: play vars 28011 1726882569.95198: variable 'profile' from source: play vars 28011 1726882569.95202: variable 'interface' from source: set_fact 28011 1726882569.95247: variable 'interface' from source: set_fact 28011 1726882569.95343: variable 'omit' from source: magic vars 28011 1726882569.95350: variable '__lsr_ansible_managed' from source: task vars 28011 1726882569.95392: variable '__lsr_ansible_managed' from source: task vars 28011 1726882569.95575: Loaded config def from plugin (lookup/template) 28011 1726882569.95579: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28011 1726882569.95603: File lookup term: get_ansible_managed.j2 28011 
1726882569.95607: variable 'ansible_search_path' from source: unknown 28011 1726882569.95610: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28011 1726882569.95622: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28011 1726882569.95635: variable 'ansible_search_path' from source: unknown 28011 1726882569.98964: variable 'ansible_managed' from source: unknown 28011 1726882569.99011: variable 'omit' from source: magic vars 28011 1726882569.99032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882569.99052: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882569.99068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882569.99080: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882569.99091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882569.99114: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882569.99117: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882569.99120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882569.99181: Set connection var ansible_connection to ssh 28011 1726882569.99190: Set connection var ansible_pipelining to False 28011 1726882569.99194: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882569.99200: Set connection var ansible_shell_executable to /bin/sh 28011 1726882569.99206: Set connection var ansible_timeout to 10 28011 1726882569.99211: Set connection var ansible_shell_type to sh 28011 1726882569.99230: variable 'ansible_shell_executable' from source: unknown 28011 1726882569.99233: variable 'ansible_connection' from source: unknown 28011 1726882569.99235: variable 'ansible_module_compression' from source: unknown 28011 1726882569.99237: variable 'ansible_shell_type' from source: unknown 28011 1726882569.99240: variable 'ansible_shell_executable' from source: unknown 28011 1726882569.99242: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882569.99245: variable 'ansible_pipelining' from source: unknown 28011 1726882569.99247: variable 'ansible_timeout' from source: unknown 28011 1726882569.99252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882569.99344: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882569.99354: variable 'omit' from source: magic vars 28011 1726882569.99357: starting attempt loop 28011 1726882569.99360: running the handler 28011 1726882569.99371: _low_level_execute_command(): starting 28011 1726882569.99377: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882569.99862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882569.99897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882569.99901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882569.99904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882569.99906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882569.99908: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882569.99967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882569.99969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882569.99971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 28011 1726882570.00019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882570.01688: stdout chunk (state=3): >>>/root <<< 28011 1726882570.01786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882570.01816: stderr chunk (state=3): >>><<< 28011 1726882570.01819: stdout chunk (state=3): >>><<< 28011 1726882570.01835: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882570.01845: _low_level_execute_command(): starting 28011 1726882570.01851: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741 `" && echo ansible-tmp-1726882570.0183575-29786-150428406554741="` 
echo /root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741 `" ) && sleep 0' 28011 1726882570.02262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.02266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.02268: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.02270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.02321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882570.02328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882570.02369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882570.04251: stdout chunk (state=3): >>>ansible-tmp-1726882570.0183575-29786-150428406554741=/root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741 <<< 28011 1726882570.04398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882570.04416: stdout chunk (state=3): >>><<< 28011 1726882570.04419: stderr chunk (state=3): >>><<< 28011 
1726882570.04435: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882570.0183575-29786-150428406554741=/root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882570.04602: variable 'ansible_module_compression' from source: unknown 28011 1726882570.04606: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28011 1726882570.04608: variable 'ansible_facts' from source: unknown 28011 1726882570.04668: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741/AnsiballZ_network_connections.py 28011 1726882570.04834: Sending initial data 28011 1726882570.04843: Sent initial data (168 bytes) 28011 
1726882570.05400: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882570.05415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882570.05428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.05480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.05548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882570.05565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882570.05590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882570.05660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882570.07175: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882570.07249: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882570.07339: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpvzjg246h /root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741/AnsiballZ_network_connections.py <<< 28011 1726882570.07358: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741/AnsiballZ_network_connections.py" <<< 28011 1726882570.07397: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpvzjg246h" to remote "/root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741/AnsiballZ_network_connections.py" <<< 28011 1726882570.08662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882570.08717: stderr chunk (state=3): >>><<< 28011 1726882570.08720: stdout chunk (state=3): >>><<< 28011 1726882570.08764: done transferring module to remote 28011 1726882570.08770: _low_level_execute_command(): starting 28011 1726882570.08776: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741/ /root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741/AnsiballZ_network_connections.py && sleep 0' 28011 1726882570.09709: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882570.09719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882570.09750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.09753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882570.09756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882570.09758: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882570.09859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.09862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882570.09864: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882570.09866: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28011 1726882570.09868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882570.09870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.09872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882570.09884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.09922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882570.09939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882570.09951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882570.10017: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882570.12449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882570.12453: stdout chunk (state=3): >>><<< 28011 1726882570.12455: stderr chunk (state=3): >>><<< 28011 1726882570.12458: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882570.12460: _low_level_execute_command(): starting 28011 1726882570.12462: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741/AnsiballZ_network_connections.py && sleep 0' 28011 1726882570.13766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882570.13770: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 28011 1726882570.13772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.13774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882570.13776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882570.13814: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.13871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882570.13989: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882570.13995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882570.14082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882570.40427: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_j0bw4ymn/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_j0bw4ymn/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/40ca51ba-dbc0-41be-afe6-db495ae3e7c1: error=unknown <<< 28011 1726882570.40562: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28011 1726882570.42389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882570.42395: stdout chunk (state=3): >>><<< 28011 1726882570.42400: stderr chunk (state=3): >>><<< 28011 1726882570.42462: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_j0bw4ymn/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_j0bw4ymn/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/40ca51ba-dbc0-41be-afe6-db495ae3e7c1: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882570.42466: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882570.42471: _low_level_execute_command(): starting 28011 1726882570.42473: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882570.0183575-29786-150428406554741/ > /dev/null 2>&1 && sleep 0' 28011 
1726882570.43038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882570.43051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882570.43061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.43070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882570.43083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882570.43091: stderr chunk (state=3): >>>debug2: match not found <<< 28011 1726882570.43101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.43115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28011 1726882570.43159: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882570.43163: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.43211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882570.43222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882570.43260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882570.43297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882570.45128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 
1726882570.45150: stderr chunk (state=3): >>><<< 28011 1726882570.45153: stdout chunk (state=3): >>><<< 28011 1726882570.45201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882570.45204: handler run complete 28011 1726882570.45223: attempt loop complete, returning result 28011 1726882570.45226: _execute() done 28011 1726882570.45240: dumping result to json 28011 1726882570.45248: done dumping result, returning 28011 1726882570.45256: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-962d-7c65-0000000000e7] 28011 1726882570.45259: sending task result for task 12673a56-9f93-962d-7c65-0000000000e7
changed: [managed_node1] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "ethtest0",
                    "persistent_state": "absent"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

28011 1726882570.45442: no more pending results, returning what we have 28011 1726882570.45445: results queue empty 28011 1726882570.45445: checking for any_errors_fatal 28011 1726882570.45451: done checking for any_errors_fatal 28011 1726882570.45451: checking for max_fail_percentage 28011 1726882570.45453: done checking for max_fail_percentage 28011 1726882570.45454: checking to see if all hosts have failed and the running result is not ok 28011 1726882570.45456: done checking to see if all hosts have failed 28011 1726882570.45457: getting the remaining hosts for this loop 28011 1726882570.45458: done getting the remaining hosts for this loop 28011 1726882570.45462: getting the next task for host managed_node1 28011 1726882570.45466: done getting next task for host managed_node1 28011 1726882570.45470: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28011 1726882570.45472: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 28011 1726882570.45481: getting variables 28011 1726882570.45482: in VariableManager get_vars() 28011 1726882570.45530: Calling all_inventory to load vars for managed_node1 28011 1726882570.45533: Calling groups_inventory to load vars for managed_node1 28011 1726882570.45535: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882570.45541: done sending task result for task 12673a56-9f93-962d-7c65-0000000000e7 28011 1726882570.45544: WORKER PROCESS EXITING 28011 1726882570.45553: Calling all_plugins_play to load vars for managed_node1 28011 1726882570.45556: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882570.45559: Calling groups_plugins_play to load vars for managed_node1 28011 1726882570.46541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882570.47903: done with get_vars() 28011 1726882570.47926: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:36:10 -0400 (0:00:00.560) 0:00:40.031 ****** 28011 1726882570.48010: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 28011 1726882570.48357: worker is 1 (out of 1 available) 28011 1726882570.48371: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 28011 1726882570.48383: done queuing things up, now waiting for results queue to drain 28011 1726882570.48385: waiting for pending results... 
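The `_low_level_execute_command()` calls earlier in this log create a private per-task temp directory on the remote host, echo its resolved name back so the controller learns the path, and remove it with `rm -f -r` after the module runs. A minimal re-creation of that shell pattern, with an illustrative directory name in place of the real timestamped one (the actual command is generated by ansible-core):

```python
import os
import subprocess

# `umask 77` makes both mkdir calls create mode-0700 directories, so the
# uploaded AnsiballZ payload is readable only by the connecting user. The
# real command uses a plain (non -p) mkdir for the leaf directory so a name
# collision fails loudly; we keep that behavior here.
cmd = (
    "( umask 77 && mkdir -p \"$base\" && mkdir \"$base/ansible-tmp-demo\" "
    "&& echo ansible-tmp-demo=\"$base/ansible-tmp-demo\" ) && sleep 0"
)
base = "/tmp/ansible-demo-tmp"  # illustrative stand-in for ~/.ansible/tmp
out = subprocess.run(
    ["/bin/sh", "-c", cmd],
    env={**os.environ, "base": base},
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)  # ansible-tmp-demo=/tmp/ansible-demo-tmp/ansible-tmp-demo

# Cleanup, mirroring the final `rm -f -r ... > /dev/null 2>&1 && sleep 0`
# command seen at the end of the task in this log.
subprocess.run(["/bin/sh", "-c", f"rm -f -r {base} > /dev/null 2>&1 && sleep 0"])
```

The `echo name="$dir"` step is what produces the `ansible-tmp-...=` line captured as stdout in the log; the controller parses that to locate the directory it will sftp `AnsiballZ_network_connections.py` into.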
28011 1726882570.48673: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 28011 1726882570.48774: in run() - task 12673a56-9f93-962d-7c65-0000000000e8 28011 1726882570.48788: variable 'ansible_search_path' from source: unknown 28011 1726882570.48792: variable 'ansible_search_path' from source: unknown 28011 1726882570.48835: calling self._execute() 28011 1726882570.48939: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882570.48943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882570.48955: variable 'omit' from source: magic vars 28011 1726882570.49278: variable 'ansible_distribution_major_version' from source: facts 28011 1726882570.49291: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882570.49376: variable 'network_state' from source: role '' defaults 28011 1726882570.49385: Evaluated conditional (network_state != {}): False 28011 1726882570.49389: when evaluation is False, skipping this task 28011 1726882570.49392: _execute() done 28011 1726882570.49396: dumping result to json 28011 1726882570.49404: done dumping result, returning 28011 1726882570.49407: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-962d-7c65-0000000000e8] 28011 1726882570.49411: sending task result for task 12673a56-9f93-962d-7c65-0000000000e8 28011 1726882570.49488: done sending task result for task 12673a56-9f93-962d-7c65-0000000000e8 28011 1726882570.49491: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
28011 1726882570.49545: no more pending results, returning what we have 28011 1726882570.49549: results queue empty 28011 1726882570.49550: checking for any_errors_fatal 28011 1726882570.49562: done checking for any_errors_fatal
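The skip above comes from the task's `when:` conditional: `network_state` comes from the role defaults as an empty dict, so `network_state != {}` evaluates False and `_execute()` short-circuits. A toy Python rendering of that check (names mirror the log; the real evaluation goes through Jinja2 templating):

```python
# Toy model of the `when: network_state != {}` check evaluated above. In the
# role, network_state defaults to {} ("from source: role '' defaults" in the
# log), so the conditional is False and the task is skipped.
def evaluate_conditional(task_vars: dict) -> bool:
    network_state = task_vars.get("network_state", {})
    return network_state != {}

print(evaluate_conditional({}))                                     # False -> skip
print(evaluate_conditional({"network_state": {"interfaces": []}}))  # True  -> run
```

This is why the same role run does configure connection profiles (the `network_connections` task, driven by a non-empty `connections` list) while the nmstate-based `network_state` path stays idle.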
28011 1726882570.49562: checking for max_fail_percentage 28011 1726882570.49565: done checking for max_fail_percentage 28011 1726882570.49566: checking to see if all hosts have failed and the running result is not ok 28011 1726882570.49566: done checking to see if all hosts have failed 28011 1726882570.49567: getting the remaining hosts for this loop 28011 1726882570.49568: done getting the remaining hosts for this loop 28011 1726882570.49572: getting the next task for host managed_node1 28011 1726882570.49578: done getting next task for host managed_node1 28011 1726882570.49581: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28011 1726882570.49584: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882570.49600: getting variables 28011 1726882570.49602: in VariableManager get_vars() 28011 1726882570.49646: Calling all_inventory to load vars for managed_node1 28011 1726882570.49648: Calling groups_inventory to load vars for managed_node1 28011 1726882570.49651: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882570.49660: Calling all_plugins_play to load vars for managed_node1 28011 1726882570.49662: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882570.49665: Calling groups_plugins_play to load vars for managed_node1 28011 1726882570.50470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882570.51829: done with get_vars() 28011 1726882570.51847: done getting variables 28011 1726882570.51891: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:36:10 -0400 (0:00:00.039) 0:00:40.070 ****** 28011 1726882570.51923: entering _queue_task() for managed_node1/debug 28011 1726882570.52167: worker is 1 (out of 1 available) 28011 1726882570.52181: exiting _queue_task() for managed_node1/debug 28011 1726882570.52198: done queuing things up, now waiting for results queue to drain 28011 1726882570.52200: waiting for pending results... 
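The "Configure networking state" skip logged above is driven by the conditional `network_state != {}`: the role's `network_state` variable comes from the role defaults and is an empty dict here, so the task is skipped with "Conditional result was False". A minimal sketch of that conditional-skip pattern (a hypothetical illustration, not the role's actual task source):

```yaml
# Hypothetical sketch of the pattern seen in the log: a task guarded by a
# `when:` conditional on a role default. With network_state left at {},
# evaluation yields False and the task is skipped, exactly as logged.
- name: Configure networking state
  debug:
    msg: "would apply {{ network_state }}"
  when: network_state != {}
```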
28011 1726882570.52374: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28011 1726882570.52456: in run() - task 12673a56-9f93-962d-7c65-0000000000e9 28011 1726882570.52468: variable 'ansible_search_path' from source: unknown 28011 1726882570.52471: variable 'ansible_search_path' from source: unknown 28011 1726882570.52501: calling self._execute() 28011 1726882570.52574: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882570.52578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882570.52591: variable 'omit' from source: magic vars 28011 1726882570.52870: variable 'ansible_distribution_major_version' from source: facts 28011 1726882570.52880: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882570.52885: variable 'omit' from source: magic vars 28011 1726882570.52920: variable 'omit' from source: magic vars 28011 1726882570.52956: variable 'omit' from source: magic vars 28011 1726882570.52991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882570.53016: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882570.53032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882570.53045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882570.53055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882570.53081: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882570.53084: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882570.53090: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 28011 1726882570.53156: Set connection var ansible_connection to ssh 28011 1726882570.53162: Set connection var ansible_pipelining to False 28011 1726882570.53168: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882570.53173: Set connection var ansible_shell_executable to /bin/sh 28011 1726882570.53179: Set connection var ansible_timeout to 10 28011 1726882570.53189: Set connection var ansible_shell_type to sh 28011 1726882570.53206: variable 'ansible_shell_executable' from source: unknown 28011 1726882570.53209: variable 'ansible_connection' from source: unknown 28011 1726882570.53212: variable 'ansible_module_compression' from source: unknown 28011 1726882570.53215: variable 'ansible_shell_type' from source: unknown 28011 1726882570.53218: variable 'ansible_shell_executable' from source: unknown 28011 1726882570.53221: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882570.53225: variable 'ansible_pipelining' from source: unknown 28011 1726882570.53228: variable 'ansible_timeout' from source: unknown 28011 1726882570.53232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882570.53335: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882570.53343: variable 'omit' from source: magic vars 28011 1726882570.53348: starting attempt loop 28011 1726882570.53351: running the handler 28011 1726882570.53444: variable '__network_connections_result' from source: set_fact 28011 1726882570.53481: handler run complete 28011 1726882570.53495: attempt loop complete, returning result 28011 1726882570.53498: _execute() done 28011 1726882570.53502: dumping result to json 28011 1726882570.53504: 
done dumping result, returning 28011 1726882570.53512: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-962d-7c65-0000000000e9] 28011 1726882570.53515: sending task result for task 12673a56-9f93-962d-7c65-0000000000e9 28011 1726882570.53595: done sending task result for task 12673a56-9f93-962d-7c65-0000000000e9 28011 1726882570.53598: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 28011 1726882570.53680: no more pending results, returning what we have 28011 1726882570.53682: results queue empty 28011 1726882570.53683: checking for any_errors_fatal 28011 1726882570.53690: done checking for any_errors_fatal 28011 1726882570.53691: checking for max_fail_percentage 28011 1726882570.53694: done checking for max_fail_percentage 28011 1726882570.53695: checking to see if all hosts have failed and the running result is not ok 28011 1726882570.53696: done checking to see if all hosts have failed 28011 1726882570.53697: getting the remaining hosts for this loop 28011 1726882570.53698: done getting the remaining hosts for this loop 28011 1726882570.53701: getting the next task for host managed_node1 28011 1726882570.53706: done getting next task for host managed_node1 28011 1726882570.53709: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28011 1726882570.53711: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882570.53720: getting variables 28011 1726882570.53721: in VariableManager get_vars() 28011 1726882570.53753: Calling all_inventory to load vars for managed_node1 28011 1726882570.53755: Calling groups_inventory to load vars for managed_node1 28011 1726882570.53758: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882570.53766: Calling all_plugins_play to load vars for managed_node1 28011 1726882570.53768: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882570.53770: Calling groups_plugins_play to load vars for managed_node1 28011 1726882570.54644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882570.55519: done with get_vars() 28011 1726882570.55536: done getting variables 28011 1726882570.55574: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:36:10 -0400 (0:00:00.036) 0:00:40.107 ****** 28011 1726882570.55597: entering _queue_task() for managed_node1/debug 28011 1726882570.55823: worker is 1 (out of 1 available) 28011 1726882570.55836: exiting _queue_task() for managed_node1/debug 28011 1726882570.55847: done queuing things up, now waiting for results queue to drain 28011 1726882570.55848: waiting for pending results... 
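The "Show stderr messages" task whose result appears above is a plain `debug` of a variable populated earlier via `set_fact` (the log shows `variable '__network_connections_result' from source: set_fact`). A hedged sketch of what such a task looks like (illustrative, not the role's verbatim source):

```yaml
# Hypothetical sketch: print the stderr_lines captured from the
# connection-provider run, matching the `ok:` result shown in the log.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```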
28011 1726882570.56032: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28011 1726882570.56107: in run() - task 12673a56-9f93-962d-7c65-0000000000ea 28011 1726882570.56120: variable 'ansible_search_path' from source: unknown 28011 1726882570.56123: variable 'ansible_search_path' from source: unknown 28011 1726882570.56151: calling self._execute() 28011 1726882570.56228: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882570.56232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882570.56242: variable 'omit' from source: magic vars 28011 1726882570.56519: variable 'ansible_distribution_major_version' from source: facts 28011 1726882570.56529: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882570.56533: variable 'omit' from source: magic vars 28011 1726882570.56561: variable 'omit' from source: magic vars 28011 1726882570.56585: variable 'omit' from source: magic vars 28011 1726882570.56621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882570.56647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882570.56663: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882570.56676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882570.56684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882570.56712: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882570.56715: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882570.56718: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 28011 1726882570.56787: Set connection var ansible_connection to ssh 28011 1726882570.56797: Set connection var ansible_pipelining to False 28011 1726882570.56803: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882570.56808: Set connection var ansible_shell_executable to /bin/sh 28011 1726882570.56815: Set connection var ansible_timeout to 10 28011 1726882570.56820: Set connection var ansible_shell_type to sh 28011 1726882570.56841: variable 'ansible_shell_executable' from source: unknown 28011 1726882570.56844: variable 'ansible_connection' from source: unknown 28011 1726882570.56847: variable 'ansible_module_compression' from source: unknown 28011 1726882570.56849: variable 'ansible_shell_type' from source: unknown 28011 1726882570.56851: variable 'ansible_shell_executable' from source: unknown 28011 1726882570.56853: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882570.56855: variable 'ansible_pipelining' from source: unknown 28011 1726882570.56857: variable 'ansible_timeout' from source: unknown 28011 1726882570.56862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882570.56962: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882570.56972: variable 'omit' from source: magic vars 28011 1726882570.56976: starting attempt loop 28011 1726882570.56979: running the handler 28011 1726882570.57020: variable '__network_connections_result' from source: set_fact 28011 1726882570.57076: variable '__network_connections_result' from source: set_fact 28011 1726882570.57147: handler run complete 28011 1726882570.57166: attempt loop complete, returning result 28011 1726882570.57170: 
_execute() done 28011 1726882570.57172: dumping result to json 28011 1726882570.57174: done dumping result, returning 28011 1726882570.57182: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-962d-7c65-0000000000ea] 28011 1726882570.57188: sending task result for task 12673a56-9f93-962d-7c65-0000000000ea 28011 1726882570.57272: done sending task result for task 12673a56-9f93-962d-7c65-0000000000ea 28011 1726882570.57276: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 28011 1726882570.57347: no more pending results, returning what we have 28011 1726882570.57350: results queue empty 28011 1726882570.57351: checking for any_errors_fatal 28011 1726882570.57355: done checking for any_errors_fatal 28011 1726882570.57356: checking for max_fail_percentage 28011 1726882570.57358: done checking for max_fail_percentage 28011 1726882570.57359: checking to see if all hosts have failed and the running result is not ok 28011 1726882570.57359: done checking to see if all hosts have failed 28011 1726882570.57360: getting the remaining hosts for this loop 28011 1726882570.57361: done getting the remaining hosts for this loop 28011 1726882570.57365: getting the next task for host managed_node1 28011 1726882570.57369: done getting next task for host managed_node1 28011 1726882570.57372: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28011 1726882570.57374: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882570.57382: getting variables 28011 1726882570.57385: in VariableManager get_vars() 28011 1726882570.57421: Calling all_inventory to load vars for managed_node1 28011 1726882570.57424: Calling groups_inventory to load vars for managed_node1 28011 1726882570.57426: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882570.57434: Calling all_plugins_play to load vars for managed_node1 28011 1726882570.57436: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882570.57438: Calling groups_plugins_play to load vars for managed_node1 28011 1726882570.58192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882570.59165: done with get_vars() 28011 1726882570.59179: done getting variables 28011 1726882570.59219: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:36:10 -0400 (0:00:00.036) 0:00:40.143 ****** 28011 1726882570.59244: entering _queue_task() for managed_node1/debug 28011 1726882570.59436: worker is 1 (out of 1 available) 28011 1726882570.59449: exiting _queue_task() for managed_node1/debug 28011 1726882570.59460: done queuing things up, now waiting for results queue to drain 28011 1726882570.59462: waiting for pending results... 
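The `module_args` dumped in the debug result above show the input the role passed to its internal connection module: one connection named `ethtest0` with `persistent_state: absent`, applied through the `nm` provider. A hedged sketch of playbook-level input that could produce that invocation (an assumption reconstructed from the logged args, not the actual test playbook):

```yaml
# Hypothetical playbook input consistent with the module_args in the log:
# remove the "ethtest0" connection via the NetworkManager provider.
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: ethtest0
            persistent_state: absent
```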
28011 1726882570.59630: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28011 1726882570.59685: in run() - task 12673a56-9f93-962d-7c65-0000000000eb 28011 1726882570.59702: variable 'ansible_search_path' from source: unknown 28011 1726882570.59706: variable 'ansible_search_path' from source: unknown 28011 1726882570.59731: calling self._execute() 28011 1726882570.59806: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882570.59810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882570.59819: variable 'omit' from source: magic vars 28011 1726882570.60079: variable 'ansible_distribution_major_version' from source: facts 28011 1726882570.60088: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882570.60173: variable 'network_state' from source: role '' defaults 28011 1726882570.60181: Evaluated conditional (network_state != {}): False 28011 1726882570.60184: when evaluation is False, skipping this task 28011 1726882570.60186: _execute() done 28011 1726882570.60191: dumping result to json 28011 1726882570.60195: done dumping result, returning 28011 1726882570.60204: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-962d-7c65-0000000000eb] 28011 1726882570.60208: sending task result for task 12673a56-9f93-962d-7c65-0000000000eb 28011 1726882570.60286: done sending task result for task 12673a56-9f93-962d-7c65-0000000000eb 28011 1726882570.60289: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 28011 1726882570.60332: no more pending results, returning what we have 28011 1726882570.60335: results queue empty 28011 1726882570.60335: checking for any_errors_fatal 28011 1726882570.60343: done checking for any_errors_fatal 28011 1726882570.60344: checking for 
max_fail_percentage 28011 1726882570.60346: done checking for max_fail_percentage 28011 1726882570.60347: checking to see if all hosts have failed and the running result is not ok 28011 1726882570.60348: done checking to see if all hosts have failed 28011 1726882570.60348: getting the remaining hosts for this loop 28011 1726882570.60350: done getting the remaining hosts for this loop 28011 1726882570.60352: getting the next task for host managed_node1 28011 1726882570.60357: done getting next task for host managed_node1 28011 1726882570.60360: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28011 1726882570.60362: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882570.60374: getting variables 28011 1726882570.60375: in VariableManager get_vars() 28011 1726882570.60412: Calling all_inventory to load vars for managed_node1 28011 1726882570.60415: Calling groups_inventory to load vars for managed_node1 28011 1726882570.60417: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882570.60425: Calling all_plugins_play to load vars for managed_node1 28011 1726882570.60427: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882570.60430: Calling groups_plugins_play to load vars for managed_node1 28011 1726882570.61173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882570.62055: done with get_vars() 28011 1726882570.62070: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:36:10 -0400 
(0:00:00.028) 0:00:40.172 ****** 28011 1726882570.62134: entering _queue_task() for managed_node1/ping 28011 1726882570.62336: worker is 1 (out of 1 available) 28011 1726882570.62349: exiting _queue_task() for managed_node1/ping 28011 1726882570.62360: done queuing things up, now waiting for results queue to drain 28011 1726882570.62362: waiting for pending results... 28011 1726882570.62531: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 28011 1726882570.62595: in run() - task 12673a56-9f93-962d-7c65-0000000000ec 28011 1726882570.62608: variable 'ansible_search_path' from source: unknown 28011 1726882570.62611: variable 'ansible_search_path' from source: unknown 28011 1726882570.62637: calling self._execute() 28011 1726882570.62713: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882570.62718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882570.62727: variable 'omit' from source: magic vars 28011 1726882570.62998: variable 'ansible_distribution_major_version' from source: facts 28011 1726882570.63008: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882570.63014: variable 'omit' from source: magic vars 28011 1726882570.63044: variable 'omit' from source: magic vars 28011 1726882570.63068: variable 'omit' from source: magic vars 28011 1726882570.63103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882570.63130: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882570.63146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882570.63159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882570.63168: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882570.63195: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882570.63198: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882570.63201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882570.63270: Set connection var ansible_connection to ssh 28011 1726882570.63276: Set connection var ansible_pipelining to False 28011 1726882570.63282: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882570.63287: Set connection var ansible_shell_executable to /bin/sh 28011 1726882570.63298: Set connection var ansible_timeout to 10 28011 1726882570.63303: Set connection var ansible_shell_type to sh 28011 1726882570.63319: variable 'ansible_shell_executable' from source: unknown 28011 1726882570.63322: variable 'ansible_connection' from source: unknown 28011 1726882570.63325: variable 'ansible_module_compression' from source: unknown 28011 1726882570.63327: variable 'ansible_shell_type' from source: unknown 28011 1726882570.63330: variable 'ansible_shell_executable' from source: unknown 28011 1726882570.63333: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882570.63337: variable 'ansible_pipelining' from source: unknown 28011 1726882570.63339: variable 'ansible_timeout' from source: unknown 28011 1726882570.63344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882570.63488: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882570.63500: variable 'omit' from source: magic vars 28011 1726882570.63503: starting attempt loop 28011 1726882570.63506: running 
the handler 28011 1726882570.63520: _low_level_execute_command(): starting 28011 1726882570.63526: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882570.64036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.64040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.64044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.64098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882570.64102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882570.64106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882570.64155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882570.65829: stdout chunk (state=3): >>>/root <<< 28011 1726882570.65923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882570.65950: stderr chunk (state=3): >>><<< 28011 1726882570.65955: stdout chunk (state=3): >>><<< 28011 1726882570.65973: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882570.65984: _low_level_execute_command(): starting 28011 1726882570.65992: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068 `" && echo ansible-tmp-1726882570.6597235-29817-120761543933068="` echo /root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068 `" ) && sleep 0' 28011 1726882570.66430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.66434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.66436: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882570.66445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882570.66447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.66488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882570.66491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882570.66543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882570.68406: stdout chunk (state=3): >>>ansible-tmp-1726882570.6597235-29817-120761543933068=/root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068 <<< 28011 1726882570.68516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882570.68544: stderr chunk (state=3): >>><<< 28011 1726882570.68547: stdout chunk (state=3): >>><<< 28011 1726882570.68561: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882570.6597235-29817-120761543933068=/root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882570.68596: variable 'ansible_module_compression' from source: unknown 28011 1726882570.68625: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28011 1726882570.68656: variable 'ansible_facts' from source: unknown 28011 1726882570.68710: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068/AnsiballZ_ping.py 28011 1726882570.68803: Sending initial data 28011 1726882570.68807: Sent initial data (153 bytes) 28011 1726882570.69236: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.69239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 
1726882570.69242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28011 1726882570.69244: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882570.69246: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.69292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882570.69301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882570.69339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882570.70859: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28011 1726882570.70866: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 
28011 1726882570.70904: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882570.70945: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmph_n_hs_h /root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068/AnsiballZ_ping.py <<< 28011 1726882570.70948: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068/AnsiballZ_ping.py" <<< 28011 1726882570.70987: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmph_n_hs_h" to remote "/root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068/AnsiballZ_ping.py" <<< 28011 1726882570.71500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882570.71532: stderr chunk (state=3): >>><<< 28011 1726882570.71536: stdout chunk (state=3): >>><<< 28011 1726882570.71575: done transferring module to remote 28011 1726882570.71583: _low_level_execute_command(): starting 28011 1726882570.71588: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068/ /root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068/AnsiballZ_ping.py && sleep 0' 28011 1726882570.71974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.72018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882570.72021: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 
1726882570.72023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.72025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 28011 1726882570.72028: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882570.72033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.72069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882570.72072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882570.72123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882570.73838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882570.73857: stderr chunk (state=3): >>><<< 28011 1726882570.73860: stdout chunk (state=3): >>><<< 28011 1726882570.73871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882570.73874: _low_level_execute_command(): starting 28011 1726882570.73879: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068/AnsiballZ_ping.py && sleep 0' 28011 1726882570.74256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.74297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882570.74301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882570.74303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.74305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882570.74307: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.74345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882570.74348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882570.74409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882570.89259: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28011 1726882570.90572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882570.90575: stdout chunk (state=3): >>><<< 28011 1726882570.90578: stderr chunk (state=3): >>><<< 28011 1726882570.90836: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882570.90840: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882570.90843: _low_level_execute_command(): starting 28011 1726882570.90845: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882570.6597235-29817-120761543933068/ > /dev/null 2>&1 && sleep 0' 28011 1726882570.91857: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882570.91876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882570.91897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882570.91983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882570.92117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882570.92213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882570.92290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882570.94258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882570.94377: stdout chunk (state=3): >>><<< 28011 1726882570.94380: stderr chunk (state=3): >>><<< 28011 1726882570.94383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882570.94385: handler run complete 28011 1726882570.94387: attempt loop complete, returning result 28011 1726882570.94389: _execute() done 28011 1726882570.94391: dumping result to json 28011 1726882570.94395: done dumping result, returning 28011 1726882570.94397: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-962d-7c65-0000000000ec] 28011 1726882570.94399: sending task result for task 12673a56-9f93-962d-7c65-0000000000ec 28011 1726882570.94471: done sending task result for task 12673a56-9f93-962d-7c65-0000000000ec ok: [managed_node1] => { "changed": false, "ping": "pong" } 28011 1726882570.94537: no more pending results, returning what we have 28011 1726882570.94541: results queue empty 28011 1726882570.94542: checking for any_errors_fatal 28011 1726882570.94549: done checking for any_errors_fatal 28011 1726882570.94549: checking for max_fail_percentage 28011 1726882570.94551: done checking for max_fail_percentage 28011 1726882570.94552: checking to see if all hosts have failed and the running result is not ok 28011 1726882570.94553: done checking to see if all hosts have failed 28011 1726882570.94554: getting the remaining hosts for this loop 28011 1726882570.94555: done getting the remaining hosts for this loop 28011 1726882570.94559: getting the next task for host managed_node1 28011 1726882570.94566: done getting next task for host managed_node1 28011 1726882570.94569: ^ task is: TASK: meta (role_complete) 28011 1726882570.94571: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28011 1726882570.94582: getting variables 28011 1726882570.94584: in VariableManager get_vars() 28011 1726882570.94625: Calling all_inventory to load vars for managed_node1 28011 1726882570.94628: Calling groups_inventory to load vars for managed_node1 28011 1726882570.94631: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882570.94644: Calling all_plugins_play to load vars for managed_node1 28011 1726882570.94648: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882570.94651: Calling groups_plugins_play to load vars for managed_node1 28011 1726882570.95309: WORKER PROCESS EXITING 28011 1726882570.96629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882570.98390: done with get_vars() 28011 1726882570.98416: done getting variables 28011 1726882570.98490: done queuing things up, now waiting for results queue to drain 28011 1726882570.98494: results queue empty 28011 1726882570.98495: checking for any_errors_fatal 28011 1726882570.98498: done checking for any_errors_fatal 28011 1726882570.98498: checking for max_fail_percentage 28011 1726882570.98499: done checking for max_fail_percentage 28011 1726882570.98500: checking to see if all hosts have failed and the running result is not ok 28011 1726882570.98501: done checking to see if all hosts have failed 28011 1726882570.98501: getting the remaining hosts for this loop 28011 1726882570.98502: done getting the remaining hosts for this loop 28011 1726882570.98505: getting the next task for host managed_node1 28011 1726882570.98508: done getting next task for host managed_node1 28011 1726882570.98510: ^ task is: TASK: meta (flush_handlers) 28011 1726882570.98511: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882570.98514: getting variables 28011 1726882570.98515: in VariableManager get_vars() 28011 1726882570.98526: Calling all_inventory to load vars for managed_node1 28011 1726882570.98528: Calling groups_inventory to load vars for managed_node1 28011 1726882570.98530: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882570.98534: Calling all_plugins_play to load vars for managed_node1 28011 1726882570.98536: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882570.98539: Calling groups_plugins_play to load vars for managed_node1 28011 1726882570.99669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882571.01215: done with get_vars() 28011 1726882571.01236: done getting variables 28011 1726882571.01287: in VariableManager get_vars() 28011 1726882571.01301: Calling all_inventory to load vars for managed_node1 28011 1726882571.01304: Calling groups_inventory to load vars for managed_node1 28011 1726882571.01306: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882571.01311: Calling all_plugins_play to load vars for managed_node1 28011 1726882571.01314: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882571.01317: Calling groups_plugins_play to load vars for managed_node1 28011 1726882571.02534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882571.04022: done with get_vars() 28011 1726882571.04049: done queuing things up, now waiting for results queue to drain 28011 1726882571.04052: results queue empty 28011 1726882571.04052: checking for any_errors_fatal 28011 1726882571.04054: done checking for any_errors_fatal 28011 1726882571.04055: checking for max_fail_percentage 28011 
1726882571.04056: done checking for max_fail_percentage 28011 1726882571.04056: checking to see if all hosts have failed and the running result is not ok 28011 1726882571.04057: done checking to see if all hosts have failed 28011 1726882571.04058: getting the remaining hosts for this loop 28011 1726882571.04059: done getting the remaining hosts for this loop 28011 1726882571.04061: getting the next task for host managed_node1 28011 1726882571.04064: done getting next task for host managed_node1 28011 1726882571.04066: ^ task is: TASK: meta (flush_handlers) 28011 1726882571.04067: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882571.04075: getting variables 28011 1726882571.04076: in VariableManager get_vars() 28011 1726882571.04086: Calling all_inventory to load vars for managed_node1 28011 1726882571.04088: Calling groups_inventory to load vars for managed_node1 28011 1726882571.04090: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882571.04096: Calling all_plugins_play to load vars for managed_node1 28011 1726882571.04099: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882571.04102: Calling groups_plugins_play to load vars for managed_node1 28011 1726882571.05195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882571.06814: done with get_vars() 28011 1726882571.06833: done getting variables 28011 1726882571.06879: in VariableManager get_vars() 28011 1726882571.06890: Calling all_inventory to load vars for managed_node1 28011 1726882571.06894: Calling groups_inventory to load vars for managed_node1 28011 1726882571.06896: Calling all_plugins_inventory to load vars for managed_node1 28011 
1726882571.06901: Calling all_plugins_play to load vars for managed_node1 28011 1726882571.06903: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882571.06906: Calling groups_plugins_play to load vars for managed_node1 28011 1726882571.08022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882571.09528: done with get_vars() 28011 1726882571.09553: done queuing things up, now waiting for results queue to drain 28011 1726882571.09555: results queue empty 28011 1726882571.09556: checking for any_errors_fatal 28011 1726882571.09557: done checking for any_errors_fatal 28011 1726882571.09558: checking for max_fail_percentage 28011 1726882571.09559: done checking for max_fail_percentage 28011 1726882571.09560: checking to see if all hosts have failed and the running result is not ok 28011 1726882571.09560: done checking to see if all hosts have failed 28011 1726882571.09561: getting the remaining hosts for this loop 28011 1726882571.09562: done getting the remaining hosts for this loop 28011 1726882571.09565: getting the next task for host managed_node1 28011 1726882571.09568: done getting next task for host managed_node1 28011 1726882571.09568: ^ task is: None 28011 1726882571.09570: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882571.09571: done queuing things up, now waiting for results queue to drain 28011 1726882571.09572: results queue empty 28011 1726882571.09572: checking for any_errors_fatal 28011 1726882571.09573: done checking for any_errors_fatal 28011 1726882571.09574: checking for max_fail_percentage 28011 1726882571.09575: done checking for max_fail_percentage 28011 1726882571.09575: checking to see if all hosts have failed and the running result is not ok 28011 1726882571.09576: done checking to see if all hosts have failed 28011 1726882571.09577: getting the next task for host managed_node1 28011 1726882571.09579: done getting next task for host managed_node1 28011 1726882571.09580: ^ task is: None 28011 1726882571.09581: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882571.09629: in VariableManager get_vars() 28011 1726882571.09643: done with get_vars() 28011 1726882571.09649: in VariableManager get_vars() 28011 1726882571.09657: done with get_vars() 28011 1726882571.09662: variable 'omit' from source: magic vars 28011 1726882571.09695: in VariableManager get_vars() 28011 1726882571.09705: done with get_vars() 28011 1726882571.09727: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 28011 1726882571.09903: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28011 1726882571.09926: getting the remaining hosts for this loop 28011 1726882571.09927: done getting the remaining hosts for this loop 28011 1726882571.09930: getting the next task for host managed_node1 28011 1726882571.09932: done getting next task for host managed_node1 28011 1726882571.09934: ^ task is: TASK: Gathering Facts 28011 1726882571.09936: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882571.09938: getting variables 28011 1726882571.09938: in VariableManager get_vars() 28011 1726882571.09946: Calling all_inventory to load vars for managed_node1 28011 1726882571.09948: Calling groups_inventory to load vars for managed_node1 28011 1726882571.09951: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882571.09956: Calling all_plugins_play to load vars for managed_node1 28011 1726882571.09958: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882571.09961: Calling groups_plugins_play to load vars for managed_node1 28011 1726882571.11155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882571.12682: done with get_vars() 28011 1726882571.12702: done getting variables 28011 1726882571.12742: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:149 Friday 20 September 2024 21:36:11 -0400 (0:00:00.506) 0:00:40.678 ****** 28011 1726882571.12765: entering _queue_task() for managed_node1/gather_facts 28011 1726882571.13074: worker is 1 (out of 1 available) 28011 1726882571.13085: exiting _queue_task() for managed_node1/gather_facts 28011 1726882571.13095: done queuing things up, now waiting for results queue to drain 28011 1726882571.13097: waiting for pending results... 
28011 1726882571.13364: running TaskExecutor() for managed_node1/TASK: Gathering Facts 28011 1726882571.13499: in run() - task 12673a56-9f93-962d-7c65-00000000085b 28011 1726882571.13503: variable 'ansible_search_path' from source: unknown 28011 1726882571.13525: calling self._execute() 28011 1726882571.13616: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882571.13698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882571.13702: variable 'omit' from source: magic vars 28011 1726882571.14021: variable 'ansible_distribution_major_version' from source: facts 28011 1726882571.14037: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882571.14046: variable 'omit' from source: magic vars 28011 1726882571.14077: variable 'omit' from source: magic vars 28011 1726882571.14118: variable 'omit' from source: magic vars 28011 1726882571.14158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882571.14203: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882571.14227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882571.14248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882571.14261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882571.14302: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882571.14386: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882571.14389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882571.14429: Set connection var ansible_connection to ssh 28011 1726882571.14443: Set 
connection var ansible_pipelining to False 28011 1726882571.14454: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882571.14464: Set connection var ansible_shell_executable to /bin/sh 28011 1726882571.14477: Set connection var ansible_timeout to 10 28011 1726882571.14489: Set connection var ansible_shell_type to sh 28011 1726882571.14521: variable 'ansible_shell_executable' from source: unknown 28011 1726882571.14599: variable 'ansible_connection' from source: unknown 28011 1726882571.14602: variable 'ansible_module_compression' from source: unknown 28011 1726882571.14605: variable 'ansible_shell_type' from source: unknown 28011 1726882571.14606: variable 'ansible_shell_executable' from source: unknown 28011 1726882571.14608: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882571.14610: variable 'ansible_pipelining' from source: unknown 28011 1726882571.14611: variable 'ansible_timeout' from source: unknown 28011 1726882571.14613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882571.14729: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882571.14746: variable 'omit' from source: magic vars 28011 1726882571.14758: starting attempt loop 28011 1726882571.14765: running the handler 28011 1726882571.14787: variable 'ansible_facts' from source: unknown 28011 1726882571.14821: _low_level_execute_command(): starting 28011 1726882571.14835: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882571.15703: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882571.15709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882571.15757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882571.15799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882571.17499: stdout chunk (state=3): >>>/root <<< 28011 1726882571.17652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882571.17655: stdout chunk (state=3): >>><<< 28011 1726882571.17661: stderr chunk (state=3): >>><<< 28011 1726882571.17721: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882571.17724: _low_level_execute_command(): starting 28011 1726882571.17729: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276 `" && echo ansible-tmp-1726882571.1768901-29838-120145514751276="` echo /root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276 `" ) && sleep 0' 28011 1726882571.18178: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882571.18210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882571.18214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882571.18224: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882571.18266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882571.18270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882571.18317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882571.20191: stdout chunk (state=3): >>>ansible-tmp-1726882571.1768901-29838-120145514751276=/root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276 <<< 28011 1726882571.20292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882571.20321: stderr chunk (state=3): >>><<< 28011 1726882571.20324: stdout chunk (state=3): >>><<< 28011 1726882571.20339: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882571.1768901-29838-120145514751276=/root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882571.20362: variable 'ansible_module_compression' from source: unknown 28011 1726882571.20401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28011 1726882571.20457: variable 'ansible_facts' from source: unknown 28011 1726882571.20589: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276/AnsiballZ_setup.py 28011 1726882571.20681: Sending initial data 28011 1726882571.20685: Sent initial data (154 bytes) 28011 1726882571.21085: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882571.21091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882571.21095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882571.21097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882571.21144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882571.21147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882571.21196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882571.22712: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882571.22755: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882571.22820: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp2rf8_9g4 /root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276/AnsiballZ_setup.py <<< 28011 1726882571.22824: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276/AnsiballZ_setup.py" <<< 28011 1726882571.22867: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp2rf8_9g4" to remote "/root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276/AnsiballZ_setup.py" <<< 28011 1726882571.23959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882571.23966: stderr chunk (state=3): >>><<< 28011 1726882571.23969: stdout chunk (state=3): >>><<< 28011 1726882571.23985: done transferring module to remote 28011 1726882571.23996: _low_level_execute_command(): starting 28011 1726882571.24001: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276/ /root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276/AnsiballZ_setup.py && sleep 0' 28011 1726882571.24394: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882571.24398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882571.24401: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882571.24403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882571.24449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882571.24461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882571.24502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882571.26227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882571.26255: stderr chunk (state=3): >>><<< 28011 1726882571.26257: stdout chunk (state=3): >>><<< 28011 1726882571.26267: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882571.26298: _low_level_execute_command(): starting 28011 1726882571.26300: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276/AnsiballZ_setup.py && sleep 0' 28011 1726882571.26672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882571.26691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882571.26706: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882571.26752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882571.26755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 
1726882571.26811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882571.88980: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.48876953125, "5m": 0.40185546875, "15m": 0.22314453125}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": 
"ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "11", "epoch": "1726882571", "epoch_int": "1726882571", "date": "2024-09-20", "time": "21:36:11", "iso8601_micro": "2024-09-21T01:36:11.542401Z", "iso8601": "2024-09-21T01:36:11Z", "iso8601_basic": "20240920T213611542401", "iso8601_basic_short": "20240920T213611", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", 
"10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansi<<< 28011 1726882571.89025: stdout chunk (state=3): >>>ble_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2942, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 589, "free": 2942}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1004, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789900800, "block_size": 4096, "block_total": 65519099, "block_available": 63913550, "block_used": 1605549, "inode_total": 131070960, "inode_available": 131029045, "inode_used": 41915, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off 
[fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial"<<< 28011 1726882571.89037: stdout chunk (state=3): >>>: "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"]<<< 28011 1726882571.89050: stdout chunk (state=3): >>>, "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": 
"/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28011 1726882571.91203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882571.91207: stdout chunk (state=3): >>><<< 28011 1726882571.91209: stderr chunk (state=3): >>><<< 28011 1726882571.91221: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.48876953125, "5m": 0.40185546875, "15m": 0.22314453125}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "11", "epoch": "1726882571", "epoch_int": "1726882571", "date": 
"2024-09-20", "time": "21:36:11", "iso8601_micro": "2024-09-21T01:36:11.542401Z", "iso8601": "2024-09-21T01:36:11Z", "iso8601_basic": "20240920T213611542401", "iso8601_basic_short": "20240920T213611", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2942, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 589, "free": 2942}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", 
"ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1004, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789900800, "block_size": 4096, "block_total": 65519099, "block_available": 63913550, "block_used": 1605549, "inode_total": 131070960, "inode_available": 131029045, "inode_used": 41915, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882571.91868: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882571.91924: _low_level_execute_command(): starting 28011 1726882571.91927: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882571.1768901-29838-120145514751276/ > /dev/null 2>&1 && sleep 0' 28011 1726882571.92363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882571.92366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882571.92368: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882571.92370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882571.92372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882571.92420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882571.92427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882571.92472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882571.94302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882571.94305: stdout chunk (state=3): >>><<< 28011 1726882571.94308: stderr chunk (state=3): >>><<< 28011 1726882571.94351: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882571.94354: handler run complete 28011 1726882571.94517: variable 'ansible_facts' from source: unknown 28011 1726882571.94604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882571.95013: variable 'ansible_facts' from source: unknown 28011 1726882571.95041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882571.95182: attempt loop complete, returning result 28011 1726882571.95198: _execute() done 28011 1726882571.95206: dumping result to json 28011 1726882571.95249: done dumping result, returning 28011 1726882571.95261: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-962d-7c65-00000000085b] 28011 1726882571.95270: sending task result for task 12673a56-9f93-962d-7c65-00000000085b ok: [managed_node1] 28011 1726882571.96368: no more pending results, returning what we have 28011 1726882571.96371: results queue empty 28011 1726882571.96372: checking for any_errors_fatal 28011 1726882571.96373: done checking for any_errors_fatal 28011 1726882571.96374: checking for max_fail_percentage 28011 1726882571.96375: done checking for max_fail_percentage 28011 1726882571.96376: checking to see if all hosts have failed and the running result is not ok 28011 1726882571.96377: done checking to see if all hosts have failed 28011 1726882571.96378: getting the remaining hosts for this loop 28011 1726882571.96379: done getting the 
remaining hosts for this loop 28011 1726882571.96383: getting the next task for host managed_node1 28011 1726882571.96390: done getting next task for host managed_node1 28011 1726882571.96392: ^ task is: TASK: meta (flush_handlers) 28011 1726882571.96396: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882571.96401: getting variables 28011 1726882571.96431: in VariableManager get_vars() 28011 1726882571.96454: Calling all_inventory to load vars for managed_node1 28011 1726882571.96457: Calling groups_inventory to load vars for managed_node1 28011 1726882571.96460: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882571.96466: done sending task result for task 12673a56-9f93-962d-7c65-00000000085b 28011 1726882571.96468: WORKER PROCESS EXITING 28011 1726882571.96478: Calling all_plugins_play to load vars for managed_node1 28011 1726882571.96481: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882571.96484: Calling groups_plugins_play to load vars for managed_node1 28011 1726882571.97997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882572.00368: done with get_vars() 28011 1726882572.00384: done getting variables 28011 1726882572.00445: in VariableManager get_vars() 28011 1726882572.00452: Calling all_inventory to load vars for managed_node1 28011 1726882572.00454: Calling groups_inventory to load vars for managed_node1 28011 1726882572.00456: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882572.00460: Calling all_plugins_play to load vars for managed_node1 28011 1726882572.00461: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882572.00463: 
Calling groups_plugins_play to load vars for managed_node1 28011 1726882572.04527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882572.05389: done with get_vars() 28011 1726882572.05411: done queuing things up, now waiting for results queue to drain 28011 1726882572.05413: results queue empty 28011 1726882572.05414: checking for any_errors_fatal 28011 1726882572.05416: done checking for any_errors_fatal 28011 1726882572.05416: checking for max_fail_percentage 28011 1726882572.05417: done checking for max_fail_percentage 28011 1726882572.05422: checking to see if all hosts have failed and the running result is not ok 28011 1726882572.05422: done checking to see if all hosts have failed 28011 1726882572.05423: getting the remaining hosts for this loop 28011 1726882572.05423: done getting the remaining hosts for this loop 28011 1726882572.05425: getting the next task for host managed_node1 28011 1726882572.05428: done getting next task for host managed_node1 28011 1726882572.05429: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 28011 1726882572.05430: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882572.05432: getting variables 28011 1726882572.05432: in VariableManager get_vars() 28011 1726882572.05438: Calling all_inventory to load vars for managed_node1 28011 1726882572.05439: Calling groups_inventory to load vars for managed_node1 28011 1726882572.05441: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882572.05445: Calling all_plugins_play to load vars for managed_node1 28011 1726882572.05446: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882572.05448: Calling groups_plugins_play to load vars for managed_node1 28011 1726882572.06096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882572.07005: done with get_vars() 28011 1726882572.07020: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:152 Friday 20 September 2024 21:36:12 -0400 (0:00:00.943) 0:00:41.622 ****** 28011 1726882572.07070: entering _queue_task() for managed_node1/include_tasks 28011 1726882572.07385: worker is 1 (out of 1 available) 28011 1726882572.07399: exiting _queue_task() for managed_node1/include_tasks 28011 1726882572.07411: done queuing things up, now waiting for results queue to drain 28011 1726882572.07412: waiting for pending results... 
28011 1726882572.07600: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' 28011 1726882572.07679: in run() - task 12673a56-9f93-962d-7c65-0000000000ef 28011 1726882572.07695: variable 'ansible_search_path' from source: unknown 28011 1726882572.07724: calling self._execute() 28011 1726882572.07802: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882572.07807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882572.07817: variable 'omit' from source: magic vars 28011 1726882572.08121: variable 'ansible_distribution_major_version' from source: facts 28011 1726882572.08131: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882572.08136: _execute() done 28011 1726882572.08139: dumping result to json 28011 1726882572.08142: done dumping result, returning 28011 1726882572.08150: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' [12673a56-9f93-962d-7c65-0000000000ef] 28011 1726882572.08154: sending task result for task 12673a56-9f93-962d-7c65-0000000000ef 28011 1726882572.08247: done sending task result for task 12673a56-9f93-962d-7c65-0000000000ef 28011 1726882572.08249: WORKER PROCESS EXITING 28011 1726882572.08273: no more pending results, returning what we have 28011 1726882572.08278: in VariableManager get_vars() 28011 1726882572.08311: Calling all_inventory to load vars for managed_node1 28011 1726882572.08314: Calling groups_inventory to load vars for managed_node1 28011 1726882572.08317: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882572.08329: Calling all_plugins_play to load vars for managed_node1 28011 1726882572.08332: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882572.08334: Calling groups_plugins_play to load vars for managed_node1 28011 1726882572.09119: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882572.09990: done with get_vars() 28011 1726882572.10004: variable 'ansible_search_path' from source: unknown 28011 1726882572.10015: we have included files to process 28011 1726882572.10016: generating all_blocks data 28011 1726882572.10017: done generating all_blocks data 28011 1726882572.10017: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28011 1726882572.10018: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28011 1726882572.10020: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28011 1726882572.10125: in VariableManager get_vars() 28011 1726882572.10138: done with get_vars() 28011 1726882572.10212: done processing included file 28011 1726882572.10213: iterating over new_blocks loaded from include file 28011 1726882572.10214: in VariableManager get_vars() 28011 1726882572.10223: done with get_vars() 28011 1726882572.10224: filtering new block on tags 28011 1726882572.10235: done filtering new block on tags 28011 1726882572.10237: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 28011 1726882572.10241: extending task lists for all hosts with included blocks 28011 1726882572.10296: done extending task lists 28011 1726882572.10297: done processing included files 28011 1726882572.10298: results queue empty 28011 1726882572.10298: checking for any_errors_fatal 28011 1726882572.10300: done checking for any_errors_fatal 28011 1726882572.10300: checking for max_fail_percentage 28011 1726882572.10301: done 
checking for max_fail_percentage 28011 1726882572.10301: checking to see if all hosts have failed and the running result is not ok 28011 1726882572.10302: done checking to see if all hosts have failed 28011 1726882572.10302: getting the remaining hosts for this loop 28011 1726882572.10303: done getting the remaining hosts for this loop 28011 1726882572.10304: getting the next task for host managed_node1 28011 1726882572.10307: done getting next task for host managed_node1 28011 1726882572.10308: ^ task is: TASK: Include the task 'get_profile_stat.yml' 28011 1726882572.10310: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882572.10311: getting variables 28011 1726882572.10312: in VariableManager get_vars() 28011 1726882572.10317: Calling all_inventory to load vars for managed_node1 28011 1726882572.10319: Calling groups_inventory to load vars for managed_node1 28011 1726882572.10320: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882572.10323: Calling all_plugins_play to load vars for managed_node1 28011 1726882572.10325: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882572.10326: Calling groups_plugins_play to load vars for managed_node1 28011 1726882572.11015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882572.12055: done with get_vars() 28011 1726882572.12074: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:36:12 -0400 (0:00:00.050) 0:00:41.672 ****** 28011 1726882572.12147: entering _queue_task() for managed_node1/include_tasks 28011 1726882572.12435: worker is 1 (out of 1 available) 28011 1726882572.12446: exiting _queue_task() for managed_node1/include_tasks 28011 1726882572.12456: done queuing things up, now waiting for results queue to drain 28011 1726882572.12457: waiting for pending results... 
28011 1726882572.12752: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 28011 1726882572.12819: in run() - task 12673a56-9f93-962d-7c65-00000000086c 28011 1726882572.12831: variable 'ansible_search_path' from source: unknown 28011 1726882572.12835: variable 'ansible_search_path' from source: unknown 28011 1726882572.12864: calling self._execute() 28011 1726882572.12948: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882572.12952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882572.12964: variable 'omit' from source: magic vars 28011 1726882572.13242: variable 'ansible_distribution_major_version' from source: facts 28011 1726882572.13251: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882572.13256: _execute() done 28011 1726882572.13259: dumping result to json 28011 1726882572.13263: done dumping result, returning 28011 1726882572.13269: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-962d-7c65-00000000086c] 28011 1726882572.13274: sending task result for task 12673a56-9f93-962d-7c65-00000000086c 28011 1726882572.13352: done sending task result for task 12673a56-9f93-962d-7c65-00000000086c 28011 1726882572.13355: WORKER PROCESS EXITING 28011 1726882572.13378: no more pending results, returning what we have 28011 1726882572.13382: in VariableManager get_vars() 28011 1726882572.13417: Calling all_inventory to load vars for managed_node1 28011 1726882572.13420: Calling groups_inventory to load vars for managed_node1 28011 1726882572.13423: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882572.13434: Calling all_plugins_play to load vars for managed_node1 28011 1726882572.13437: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882572.13440: Calling groups_plugins_play to load vars for managed_node1 28011 
1726882572.14222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882572.15471: done with get_vars() 28011 1726882572.15490: variable 'ansible_search_path' from source: unknown 28011 1726882572.15492: variable 'ansible_search_path' from source: unknown 28011 1726882572.15526: we have included files to process 28011 1726882572.15528: generating all_blocks data 28011 1726882572.15529: done generating all_blocks data 28011 1726882572.15530: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28011 1726882572.15531: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28011 1726882572.15533: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28011 1726882572.16342: done processing included file 28011 1726882572.16343: iterating over new_blocks loaded from include file 28011 1726882572.16344: in VariableManager get_vars() 28011 1726882572.16352: done with get_vars() 28011 1726882572.16353: filtering new block on tags 28011 1726882572.16367: done filtering new block on tags 28011 1726882572.16369: in VariableManager get_vars() 28011 1726882572.16375: done with get_vars() 28011 1726882572.16376: filtering new block on tags 28011 1726882572.16390: done filtering new block on tags 28011 1726882572.16391: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 28011 1726882572.16396: extending task lists for all hosts with included blocks 28011 1726882572.16454: done extending task lists 28011 1726882572.16455: done processing included files 28011 1726882572.16455: results queue empty 28011 
1726882572.16456: checking for any_errors_fatal 28011 1726882572.16458: done checking for any_errors_fatal 28011 1726882572.16458: checking for max_fail_percentage 28011 1726882572.16459: done checking for max_fail_percentage 28011 1726882572.16460: checking to see if all hosts have failed and the running result is not ok 28011 1726882572.16460: done checking to see if all hosts have failed 28011 1726882572.16460: getting the remaining hosts for this loop 28011 1726882572.16461: done getting the remaining hosts for this loop 28011 1726882572.16462: getting the next task for host managed_node1 28011 1726882572.16465: done getting next task for host managed_node1 28011 1726882572.16466: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 28011 1726882572.16468: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882572.16470: getting variables 28011 1726882572.16470: in VariableManager get_vars() 28011 1726882572.16513: Calling all_inventory to load vars for managed_node1 28011 1726882572.16516: Calling groups_inventory to load vars for managed_node1 28011 1726882572.16517: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882572.16521: Calling all_plugins_play to load vars for managed_node1 28011 1726882572.16523: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882572.16524: Calling groups_plugins_play to load vars for managed_node1 28011 1726882572.17177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882572.18431: done with get_vars() 28011 1726882572.18450: done getting variables 28011 1726882572.18489: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:36:12 -0400 (0:00:00.063) 0:00:41.736 ****** 28011 1726882572.18519: entering _queue_task() for managed_node1/set_fact 28011 1726882572.18821: worker is 1 (out of 1 available) 28011 1726882572.18834: exiting _queue_task() for managed_node1/set_fact 28011 1726882572.18847: done queuing things up, now waiting for results queue to drain 28011 1726882572.18848: waiting for pending results... 
28011 1726882572.19139: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 28011 1726882572.19234: in run() - task 12673a56-9f93-962d-7c65-00000000087b 28011 1726882572.19302: variable 'ansible_search_path' from source: unknown 28011 1726882572.19306: variable 'ansible_search_path' from source: unknown 28011 1726882572.19308: calling self._execute() 28011 1726882572.19399: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882572.19412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882572.19428: variable 'omit' from source: magic vars 28011 1726882572.19819: variable 'ansible_distribution_major_version' from source: facts 28011 1726882572.19837: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882572.19847: variable 'omit' from source: magic vars 28011 1726882572.19897: variable 'omit' from source: magic vars 28011 1726882572.19937: variable 'omit' from source: magic vars 28011 1726882572.19997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882572.20024: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882572.20105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882572.20108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882572.20110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882572.20121: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882572.20130: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882572.20138: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 28011 1726882572.20587: Set connection var ansible_connection to ssh 28011 1726882572.20591: Set connection var ansible_pipelining to False 28011 1726882572.20599: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882572.20602: Set connection var ansible_shell_executable to /bin/sh 28011 1726882572.20604: Set connection var ansible_timeout to 10 28011 1726882572.20606: Set connection var ansible_shell_type to sh 28011 1726882572.20608: variable 'ansible_shell_executable' from source: unknown 28011 1726882572.20610: variable 'ansible_connection' from source: unknown 28011 1726882572.20612: variable 'ansible_module_compression' from source: unknown 28011 1726882572.20614: variable 'ansible_shell_type' from source: unknown 28011 1726882572.20616: variable 'ansible_shell_executable' from source: unknown 28011 1726882572.20618: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882572.20620: variable 'ansible_pipelining' from source: unknown 28011 1726882572.20623: variable 'ansible_timeout' from source: unknown 28011 1726882572.20625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882572.20732: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882572.20961: variable 'omit' from source: magic vars 28011 1726882572.20964: starting attempt loop 28011 1726882572.20966: running the handler 28011 1726882572.20968: handler run complete 28011 1726882572.20970: attempt loop complete, returning result 28011 1726882572.20971: _execute() done 28011 1726882572.20973: dumping result to json 28011 1726882572.20976: done dumping result, returning 28011 1726882572.20978: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-962d-7c65-00000000087b] 28011 1726882572.20981: sending task result for task 12673a56-9f93-962d-7c65-00000000087b 28011 1726882572.21047: done sending task result for task 12673a56-9f93-962d-7c65-00000000087b 28011 1726882572.21051: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
28011 1726882572.21108: no more pending results, returning what we have 28011 1726882572.21111: results queue empty 28011 1726882572.21112: checking for any_errors_fatal 28011 1726882572.21114: done checking for any_errors_fatal 28011 1726882572.21114: checking for max_fail_percentage 28011 1726882572.21116: done checking for max_fail_percentage 28011 1726882572.21117: checking to see if all hosts have failed and the running result is not ok 28011 1726882572.21118: done checking to see if all hosts have failed 28011 1726882572.21118: getting the remaining hosts for this loop 28011 1726882572.21120: done getting the remaining hosts for this loop 28011 1726882572.21123: getting the next task for host managed_node1 28011 1726882572.21129: done getting next task for host managed_node1 28011 1726882572.21131: ^ task is: TASK: Stat profile file 28011 1726882572.21134: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882572.21138: getting variables 28011 1726882572.21140: in VariableManager get_vars() 28011 1726882572.21165: Calling all_inventory to load vars for managed_node1 28011 1726882572.21167: Calling groups_inventory to load vars for managed_node1 28011 1726882572.21170: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882572.21182: Calling all_plugins_play to load vars for managed_node1 28011 1726882572.21185: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882572.21190: Calling groups_plugins_play to load vars for managed_node1 28011 1726882572.22883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882572.25211: done with get_vars() 28011 1726882572.25234: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:36:12 -0400 (0:00:00.068) 0:00:41.804 ****** 28011 1726882572.25327: entering _queue_task() for managed_node1/stat 28011 1726882572.25658: worker is 1 (out of 1 available) 28011 1726882572.25670: exiting _queue_task() for managed_node1/stat 28011 1726882572.25681: done queuing things up, now waiting for results queue to drain 28011 1726882572.25683: waiting for pending results... 
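This next task, at get_profile_stat.yml:9, is the first in the include that touches the managed node, so the log that follows shows the full remote execution pipeline: `_low_level_execute_command()` probing the home directory, creating a remote tmpdir, transferring `AnsiballZ_stat.py` over SFTP, running it, and cleaning up. From the `module_args` echoed in the module's stdout (`path: /etc/sysconfig/network-scripts/ifcfg-ethtest0`, with `get_attributes`, `get_checksum`, and `get_mime` all false) and the `profile`/`interface` variables resolved just before it, the task presumably looks something like this; the templated path and `register` name are assumptions:

```yaml
# Hedged reconstruction of the "Stat profile file" task at
# get_profile_stat.yml:9. The initscripts profile name "ethtest0" is
# substituted via a variable; the exact expression is an assumption.
- name: Stat profile file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat
```

The module result below (`"stat": {"exists": false}`) then feeds the `lsr_net_profile_exists` flag initialized by the previous task.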
28011 1726882572.26015: running TaskExecutor() for managed_node1/TASK: Stat profile file 28011 1726882572.26134: in run() - task 12673a56-9f93-962d-7c65-00000000087c 28011 1726882572.26138: variable 'ansible_search_path' from source: unknown 28011 1726882572.26141: variable 'ansible_search_path' from source: unknown 28011 1726882572.26162: calling self._execute() 28011 1726882572.26258: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882572.26498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882572.26502: variable 'omit' from source: magic vars 28011 1726882572.26651: variable 'ansible_distribution_major_version' from source: facts 28011 1726882572.26670: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882572.26681: variable 'omit' from source: magic vars 28011 1726882572.26731: variable 'omit' from source: magic vars 28011 1726882572.26834: variable 'profile' from source: include params 28011 1726882572.26845: variable 'interface' from source: set_fact 28011 1726882572.26916: variable 'interface' from source: set_fact 28011 1726882572.26937: variable 'omit' from source: magic vars 28011 1726882572.26985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882572.27027: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882572.27054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882572.27098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882572.27102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882572.27129: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 
1726882572.27138: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882572.27170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882572.27249: Set connection var ansible_connection to ssh 28011 1726882572.27262: Set connection var ansible_pipelining to False 28011 1726882572.27271: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882572.27299: Set connection var ansible_shell_executable to /bin/sh 28011 1726882572.27302: Set connection var ansible_timeout to 10 28011 1726882572.27311: Set connection var ansible_shell_type to sh 28011 1726882572.27339: variable 'ansible_shell_executable' from source: unknown 28011 1726882572.27346: variable 'ansible_connection' from source: unknown 28011 1726882572.27409: variable 'ansible_module_compression' from source: unknown 28011 1726882572.27412: variable 'ansible_shell_type' from source: unknown 28011 1726882572.27414: variable 'ansible_shell_executable' from source: unknown 28011 1726882572.27416: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882572.27418: variable 'ansible_pipelining' from source: unknown 28011 1726882572.27420: variable 'ansible_timeout' from source: unknown 28011 1726882572.27422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882572.27590: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882572.27608: variable 'omit' from source: magic vars 28011 1726882572.27618: starting attempt loop 28011 1726882572.27630: running the handler 28011 1726882572.27648: _low_level_execute_command(): starting 28011 1726882572.27659: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882572.28389: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882572.28408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.28478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882572.28510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882572.28721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882572.30457: stdout chunk (state=3): >>>/root <<< 28011 1726882572.30481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882572.30487: stdout chunk (state=3): >>><<< 28011 1726882572.30500: stderr chunk (state=3): >>><<< 28011 1726882572.30519: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882572.30535: _low_level_execute_command(): starting 28011 1726882572.30540: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819 `" && echo ansible-tmp-1726882572.3052013-29891-45421929093819="` echo /root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819 `" ) && sleep 0' 28011 1726882572.31658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882572.31668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882572.31679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882572.31699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882572.31711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882572.31723: stderr chunk (state=3): >>>debug2: match not found <<< 28011 
1726882572.31726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.31814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882572.31822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882572.31870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882572.31907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882572.33812: stdout chunk (state=3): >>>ansible-tmp-1726882572.3052013-29891-45421929093819=/root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819 <<< 28011 1726882572.33999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882572.34011: stdout chunk (state=3): >>><<< 28011 1726882572.34025: stderr chunk (state=3): >>><<< 28011 1726882572.34051: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882572.3052013-29891-45421929093819=/root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882572.34111: variable 'ansible_module_compression' from source: unknown 28011 1726882572.34179: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28011 1726882572.34230: variable 'ansible_facts' from source: unknown 28011 1726882572.34319: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819/AnsiballZ_stat.py 28011 1726882572.34730: Sending initial data 28011 1726882572.34734: Sent initial data (152 bytes) 28011 1726882572.35423: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882572.35442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882572.35458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882572.35474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882572.35496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882572.35562: stderr chunk (state=3): 
>>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.35606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882572.35626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882572.35649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882572.35719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882572.37229: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882572.37276: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882572.37329: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpugl3mmcw /root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819/AnsiballZ_stat.py <<< 28011 1726882572.37333: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819/AnsiballZ_stat.py" <<< 28011 1726882572.37379: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpugl3mmcw" to remote "/root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819/AnsiballZ_stat.py" <<< 28011 1726882572.38033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882572.38142: stderr chunk (state=3): >>><<< 28011 1726882572.38145: stdout chunk (state=3): >>><<< 28011 1726882572.38147: done transferring module to remote 28011 1726882572.38149: _low_level_execute_command(): starting 28011 1726882572.38152: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819/ /root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819/AnsiballZ_stat.py && sleep 0' 28011 1726882572.38724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882572.38738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882572.38808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.38860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882572.38875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882572.38898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882572.38972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882572.40842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882572.40845: stdout chunk (state=3): >>><<< 28011 1726882572.40847: stderr chunk (state=3): >>><<< 28011 1726882572.40850: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882572.40853: _low_level_execute_command(): starting 28011 1726882572.40856: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819/AnsiballZ_stat.py && sleep 0' 28011 1726882572.41378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882572.41399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882572.41415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882572.41430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882572.41449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882572.41507: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.41554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 
28011 1726882572.41564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882572.41581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882572.41681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882572.56539: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28011 1726882572.57912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882572.57916: stdout chunk (state=3): >>><<< 28011 1726882572.57918: stderr chunk (state=3): >>><<< 28011 1726882572.57921: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882572.57924: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882572.57926: _low_level_execute_command(): starting 28011 1726882572.57928: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882572.3052013-29891-45421929093819/ > /dev/null 2>&1 && sleep 0' 28011 1726882572.58587: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882572.58608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882572.58714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882572.58748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882572.60589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882572.60602: stderr chunk (state=3): >>><<< 28011 1726882572.60605: stdout chunk (state=3): >>><<< 28011 1726882572.60618: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882572.60630: handler run complete 28011 1726882572.60642: attempt loop complete, returning result 28011 1726882572.60645: _execute() done 28011 1726882572.60648: dumping result to json 28011 1726882572.60650: done dumping result, returning 28011 1726882572.60658: done running TaskExecutor() for managed_node1/TASK: Stat profile file [12673a56-9f93-962d-7c65-00000000087c] 28011 1726882572.60660: sending task result for task 12673a56-9f93-962d-7c65-00000000087c ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 28011 1726882572.60814: no more pending results, returning what we have 28011 1726882572.60818: results queue empty 28011 1726882572.60818: checking for any_errors_fatal 28011 1726882572.60827: done checking for any_errors_fatal 28011 1726882572.60827: checking for max_fail_percentage 28011 1726882572.60829: done checking for max_fail_percentage 28011 1726882572.60830: checking to see if all hosts have failed and the running result is not ok 28011 1726882572.60830: done checking to see if all hosts have failed 28011 1726882572.60831: getting the remaining hosts for this loop 28011 1726882572.60832: done getting the remaining hosts for this loop 28011 1726882572.60837: getting the next task for host managed_node1 28011 1726882572.60846: done getting next task for host managed_node1 28011 1726882572.60848: ^ task is: TASK: Set NM profile exist flag based on the profile files 28011 1726882572.60851: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882572.60855: getting variables 28011 1726882572.60857: in VariableManager get_vars() 28011 1726882572.60886: Calling all_inventory to load vars for managed_node1 28011 1726882572.60888: Calling groups_inventory to load vars for managed_node1 28011 1726882572.60891: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882572.61059: Calling all_plugins_play to load vars for managed_node1 28011 1726882572.61063: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882572.61066: Calling groups_plugins_play to load vars for managed_node1 28011 1726882572.61883: done sending task result for task 12673a56-9f93-962d-7c65-00000000087c 28011 1726882572.61887: WORKER PROCESS EXITING 28011 1726882572.61900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882572.63507: done with get_vars() 28011 1726882572.63529: done getting variables 28011 1726882572.63586: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on 
the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:36:12 -0400 (0:00:00.382) 0:00:42.187 ****** 28011 1726882572.63626: entering _queue_task() for managed_node1/set_fact 28011 1726882572.63961: worker is 1 (out of 1 available) 28011 1726882572.63974: exiting _queue_task() for managed_node1/set_fact 28011 1726882572.63985: done queuing things up, now waiting for results queue to drain 28011 1726882572.63987: waiting for pending results... 28011 1726882572.64167: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 28011 1726882572.64249: in run() - task 12673a56-9f93-962d-7c65-00000000087d 28011 1726882572.64260: variable 'ansible_search_path' from source: unknown 28011 1726882572.64266: variable 'ansible_search_path' from source: unknown 28011 1726882572.64297: calling self._execute() 28011 1726882572.64368: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882572.64377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882572.64389: variable 'omit' from source: magic vars 28011 1726882572.64670: variable 'ansible_distribution_major_version' from source: facts 28011 1726882572.64680: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882572.64768: variable 'profile_stat' from source: set_fact 28011 1726882572.64778: Evaluated conditional (profile_stat.stat.exists): False 28011 1726882572.64782: when evaluation is False, skipping this task 28011 1726882572.64784: _execute() done 28011 1726882572.64790: dumping result to json 28011 1726882572.64794: done dumping result, returning 28011 1726882572.64800: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-962d-7c65-00000000087d] 28011 1726882572.64809: sending task 
result for task 12673a56-9f93-962d-7c65-00000000087d 28011 1726882572.64884: done sending task result for task 12673a56-9f93-962d-7c65-00000000087d 28011 1726882572.64889: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28011 1726882572.64933: no more pending results, returning what we have 28011 1726882572.64937: results queue empty 28011 1726882572.64938: checking for any_errors_fatal 28011 1726882572.64948: done checking for any_errors_fatal 28011 1726882572.64949: checking for max_fail_percentage 28011 1726882572.64951: done checking for max_fail_percentage 28011 1726882572.64952: checking to see if all hosts have failed and the running result is not ok 28011 1726882572.64952: done checking to see if all hosts have failed 28011 1726882572.64953: getting the remaining hosts for this loop 28011 1726882572.64954: done getting the remaining hosts for this loop 28011 1726882572.64958: getting the next task for host managed_node1 28011 1726882572.64964: done getting next task for host managed_node1 28011 1726882572.64966: ^ task is: TASK: Get NM profile info 28011 1726882572.64969: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28011 1726882572.64974: getting variables 28011 1726882572.64975: in VariableManager get_vars() 28011 1726882572.65005: Calling all_inventory to load vars for managed_node1 28011 1726882572.65007: Calling groups_inventory to load vars for managed_node1 28011 1726882572.65010: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882572.65020: Calling all_plugins_play to load vars for managed_node1 28011 1726882572.65023: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882572.65026: Calling groups_plugins_play to load vars for managed_node1 28011 1726882572.66257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882572.67742: done with get_vars() 28011 1726882572.67768: done getting variables 28011 1726882572.67869: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:36:12 -0400 (0:00:00.042) 0:00:42.230 ****** 28011 1726882572.67902: entering _queue_task() for managed_node1/shell 28011 1726882572.67903: Creating lock for shell 28011 1726882572.68158: worker is 1 (out of 1 available) 28011 1726882572.68171: exiting _queue_task() for managed_node1/shell 28011 1726882572.68184: done queuing things up, now waiting for results queue to drain 28011 1726882572.68195: waiting for pending results... 
28011 1726882572.68355: running TaskExecutor() for managed_node1/TASK: Get NM profile info 28011 1726882572.68425: in run() - task 12673a56-9f93-962d-7c65-00000000087e 28011 1726882572.68436: variable 'ansible_search_path' from source: unknown 28011 1726882572.68439: variable 'ansible_search_path' from source: unknown 28011 1726882572.68465: calling self._execute() 28011 1726882572.68540: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882572.68544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882572.68554: variable 'omit' from source: magic vars 28011 1726882572.68881: variable 'ansible_distribution_major_version' from source: facts 28011 1726882572.69099: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882572.69102: variable 'omit' from source: magic vars 28011 1726882572.69104: variable 'omit' from source: magic vars 28011 1726882572.69106: variable 'profile' from source: include params 28011 1726882572.69108: variable 'interface' from source: set_fact 28011 1726882572.69116: variable 'interface' from source: set_fact 28011 1726882572.69136: variable 'omit' from source: magic vars 28011 1726882572.69173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882572.69210: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882572.69235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882572.69253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882572.69266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882572.69302: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 
1726882572.69310: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882572.69316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882572.69404: Set connection var ansible_connection to ssh 28011 1726882572.69415: Set connection var ansible_pipelining to False 28011 1726882572.69423: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882572.69430: Set connection var ansible_shell_executable to /bin/sh 28011 1726882572.69439: Set connection var ansible_timeout to 10 28011 1726882572.69446: Set connection var ansible_shell_type to sh 28011 1726882572.69471: variable 'ansible_shell_executable' from source: unknown 28011 1726882572.69476: variable 'ansible_connection' from source: unknown 28011 1726882572.69482: variable 'ansible_module_compression' from source: unknown 28011 1726882572.69487: variable 'ansible_shell_type' from source: unknown 28011 1726882572.69492: variable 'ansible_shell_executable' from source: unknown 28011 1726882572.69499: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882572.69505: variable 'ansible_pipelining' from source: unknown 28011 1726882572.69510: variable 'ansible_timeout' from source: unknown 28011 1726882572.69516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882572.69637: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882572.69734: variable 'omit' from source: magic vars 28011 1726882572.69760: starting attempt loop 28011 1726882572.69803: running the handler 28011 1726882572.69817: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882572.69911: _low_level_execute_command(): starting 28011 1726882572.70216: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882572.71049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882572.71059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882572.71072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882572.71085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882572.71100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.71141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882572.71153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882572.71211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882572.72772: stdout chunk (state=3): >>>/root <<< 28011 1726882572.72866: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882572.72895: stderr chunk (state=3): >>><<< 28011 1726882572.72899: stdout chunk (state=3): >>><<< 28011 1726882572.72916: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882572.72929: _low_level_execute_command(): starting 28011 1726882572.72934: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823 `" && echo ansible-tmp-1726882572.729177-29916-10514573902823="` echo /root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823 `" ) && sleep 0' 28011 1726882572.73356: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882572.73359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.73362: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882572.73364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882572.73366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.73411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882572.73414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882572.73460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882572.75308: stdout chunk (state=3): >>>ansible-tmp-1726882572.729177-29916-10514573902823=/root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823 <<< 28011 1726882572.75417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882572.75440: stderr chunk (state=3): >>><<< 28011 1726882572.75444: stdout chunk (state=3): >>><<< 28011 1726882572.75459: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882572.729177-29916-10514573902823=/root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882572.75484: variable 'ansible_module_compression' from source: unknown 28011 1726882572.75528: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28011 1726882572.75560: variable 'ansible_facts' from source: unknown 28011 1726882572.75615: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823/AnsiballZ_command.py 28011 1726882572.75713: Sending initial data 28011 1726882572.75716: Sent initial data (154 bytes) 28011 1726882572.76145: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882572.76149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882572.76151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28011 1726882572.76155: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882572.76158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.76207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882572.76210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882572.76254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882572.77759: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports 
extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28011 1726882572.77768: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882572.77803: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28011 1726882572.77842: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpympxhv_t /root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823/AnsiballZ_command.py <<< 28011 1726882572.77850: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823/AnsiballZ_command.py" <<< 28011 1726882572.77885: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpympxhv_t" to remote "/root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823/AnsiballZ_command.py" <<< 28011 1726882572.78410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882572.78449: stderr chunk (state=3): >>><<< 28011 1726882572.78452: stdout chunk (state=3): >>><<< 28011 1726882572.78498: done transferring module to remote 28011 1726882572.78501: _low_level_execute_command(): starting 28011 1726882572.78503: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823/ /root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823/AnsiballZ_command.py && sleep 0' 28011 1726882572.78924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882572.78927: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882572.78932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.78934: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882572.78936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.78987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882572.78991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882572.79031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882572.80735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882572.80757: stderr chunk (state=3): >>><<< 28011 1726882572.80760: stdout chunk (state=3): >>><<< 28011 1726882572.80774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882572.80778: _low_level_execute_command(): starting 28011 1726882572.80780: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823/AnsiballZ_command.py && sleep 0' 28011 1726882572.81209: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882572.81212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.81215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 1726882572.81217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882572.81219: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882572.81221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.81265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882572.81268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882572.81321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882572.97895: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 21:36:12.962130", "end": "2024-09-20 21:36:12.977804", "delta": "0:00:00.015674", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882572.99242: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882572.99269: stderr chunk (state=3): >>><<< 28011 1726882572.99273: stdout chunk (state=3): >>><<< 28011 1726882572.99296: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 21:36:12.962130", "end": "2024-09-20 21:36:12.977804", "delta": "0:00:00.015674", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 
closed. 28011 1726882572.99325: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882572.99334: _low_level_execute_command(): starting 28011 1726882572.99336: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882572.729177-29916-10514573902823/ > /dev/null 2>&1 && sleep 0' 28011 1726882572.99777: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882572.99780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882572.99782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.99784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
28011 1726882572.99789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882572.99842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882572.99851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882572.99854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882572.99889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882573.01716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882573.01719: stdout chunk (state=3): >>><<< 28011 1726882573.01722: stderr chunk (state=3): >>><<< 28011 1726882573.01846: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882573.01849: handler run complete 28011 1726882573.01852: Evaluated conditional (False): False 28011 1726882573.01854: attempt loop complete, returning result 28011 1726882573.01856: _execute() done 28011 1726882573.01858: dumping result to json 28011 1726882573.01860: done dumping result, returning 28011 1726882573.01862: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [12673a56-9f93-962d-7c65-00000000087e] 28011 1726882573.01864: sending task result for task 12673a56-9f93-962d-7c65-00000000087e 28011 1726882573.01929: done sending task result for task 12673a56-9f93-962d-7c65-00000000087e 28011 1726882573.01932: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.015674", "end": "2024-09-20 21:36:12.977804", "rc": 1, "start": "2024-09-20 21:36:12.962130" } MSG: non-zero return code ...ignoring 28011 1726882573.02005: no more pending results, returning what we have 28011 1726882573.02008: results queue empty 28011 1726882573.02009: checking for any_errors_fatal 28011 1726882573.02015: done checking for any_errors_fatal 28011 1726882573.02016: checking for max_fail_percentage 28011 1726882573.02018: done checking for max_fail_percentage 28011 1726882573.02019: checking to see if all hosts have failed and the running result is not ok 28011 1726882573.02020: done checking to see if all hosts have failed 28011 1726882573.02020: getting the remaining hosts for this loop 28011 1726882573.02022: done getting the remaining hosts for this loop 28011 1726882573.02025: getting the next task for host managed_node1 28011 1726882573.02031: done getting next task for host managed_node1 28011 1726882573.02035: ^ task 
is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28011 1726882573.02038: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882573.02043: getting variables 28011 1726882573.02045: in VariableManager get_vars() 28011 1726882573.02072: Calling all_inventory to load vars for managed_node1 28011 1726882573.02074: Calling groups_inventory to load vars for managed_node1 28011 1726882573.02078: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882573.02088: Calling all_plugins_play to load vars for managed_node1 28011 1726882573.02091: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882573.02231: Calling groups_plugins_play to load vars for managed_node1 28011 1726882573.03676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882573.05300: done with get_vars() 28011 1726882573.05322: done getting variables 28011 1726882573.05380: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:36:13 -0400 (0:00:00.375) 0:00:42.605 ****** 28011 1726882573.05416: entering _queue_task() for managed_node1/set_fact 28011 1726882573.05736: worker is 1 (out of 1 available) 28011 1726882573.05748: exiting _queue_task() for managed_node1/set_fact 28011 1726882573.05760: done queuing things up, now waiting for results queue to drain 28011 1726882573.05762: waiting for pending results... 28011 1726882573.06122: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28011 1726882573.06219: in run() - task 12673a56-9f93-962d-7c65-00000000087f 28011 1726882573.06225: variable 'ansible_search_path' from source: unknown 28011 1726882573.06328: variable 'ansible_search_path' from source: unknown 28011 1726882573.06333: calling self._execute() 28011 1726882573.06373: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.06389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.06411: variable 'omit' from source: magic vars 28011 1726882573.06805: variable 'ansible_distribution_major_version' from source: facts 28011 1726882573.06823: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882573.06956: variable 'nm_profile_exists' from source: set_fact 28011 1726882573.06978: Evaluated conditional (nm_profile_exists.rc == 0): False 28011 1726882573.06991: when evaluation is False, skipping this task 28011 1726882573.07001: _execute() done 28011 1726882573.07009: dumping result to 
json 28011 1726882573.07017: done dumping result, returning 28011 1726882573.07027: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-962d-7c65-00000000087f] 28011 1726882573.07038: sending task result for task 12673a56-9f93-962d-7c65-00000000087f skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 28011 1726882573.07282: no more pending results, returning what we have 28011 1726882573.07289: results queue empty 28011 1726882573.07290: checking for any_errors_fatal 28011 1726882573.07301: done checking for any_errors_fatal 28011 1726882573.07302: checking for max_fail_percentage 28011 1726882573.07304: done checking for max_fail_percentage 28011 1726882573.07305: checking to see if all hosts have failed and the running result is not ok 28011 1726882573.07307: done checking to see if all hosts have failed 28011 1726882573.07307: getting the remaining hosts for this loop 28011 1726882573.07309: done getting the remaining hosts for this loop 28011 1726882573.07313: getting the next task for host managed_node1 28011 1726882573.07324: done getting next task for host managed_node1 28011 1726882573.07326: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 28011 1726882573.07330: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882573.07335: getting variables 28011 1726882573.07337: in VariableManager get_vars() 28011 1726882573.07365: Calling all_inventory to load vars for managed_node1 28011 1726882573.07368: Calling groups_inventory to load vars for managed_node1 28011 1726882573.07372: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882573.07506: Calling all_plugins_play to load vars for managed_node1 28011 1726882573.07510: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882573.07514: Calling groups_plugins_play to load vars for managed_node1 28011 1726882573.08106: done sending task result for task 12673a56-9f93-962d-7c65-00000000087f 28011 1726882573.08110: WORKER PROCESS EXITING 28011 1726882573.09053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882573.10637: done with get_vars() 28011 1726882573.10656: done getting variables 28011 1726882573.10717: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882573.10835: variable 'profile' from source: include params 28011 1726882573.10839: variable 'interface' from source: set_fact 28011 1726882573.10907: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 
September 2024 21:36:13 -0400 (0:00:00.055) 0:00:42.660 ****** 28011 1726882573.10936: entering _queue_task() for managed_node1/command 28011 1726882573.11404: worker is 1 (out of 1 available) 28011 1726882573.11414: exiting _queue_task() for managed_node1/command 28011 1726882573.11424: done queuing things up, now waiting for results queue to drain 28011 1726882573.11425: waiting for pending results... 28011 1726882573.11492: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 28011 1726882573.11615: in run() - task 12673a56-9f93-962d-7c65-000000000881 28011 1726882573.11636: variable 'ansible_search_path' from source: unknown 28011 1726882573.11647: variable 'ansible_search_path' from source: unknown 28011 1726882573.11692: calling self._execute() 28011 1726882573.11792: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.11809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.11824: variable 'omit' from source: magic vars 28011 1726882573.12192: variable 'ansible_distribution_major_version' from source: facts 28011 1726882573.12215: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882573.12342: variable 'profile_stat' from source: set_fact 28011 1726882573.12399: Evaluated conditional (profile_stat.stat.exists): False 28011 1726882573.12402: when evaluation is False, skipping this task 28011 1726882573.12405: _execute() done 28011 1726882573.12407: dumping result to json 28011 1726882573.12409: done dumping result, returning 28011 1726882573.12411: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [12673a56-9f93-962d-7c65-000000000881] 28011 1726882573.12419: sending task result for task 12673a56-9f93-962d-7c65-000000000881 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was 
False" } 28011 1726882573.12572: no more pending results, returning what we have 28011 1726882573.12576: results queue empty 28011 1726882573.12577: checking for any_errors_fatal 28011 1726882573.12582: done checking for any_errors_fatal 28011 1726882573.12583: checking for max_fail_percentage 28011 1726882573.12585: done checking for max_fail_percentage 28011 1726882573.12588: checking to see if all hosts have failed and the running result is not ok 28011 1726882573.12590: done checking to see if all hosts have failed 28011 1726882573.12590: getting the remaining hosts for this loop 28011 1726882573.12591: done getting the remaining hosts for this loop 28011 1726882573.12597: getting the next task for host managed_node1 28011 1726882573.12604: done getting next task for host managed_node1 28011 1726882573.12607: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 28011 1726882573.12610: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882573.12615: getting variables 28011 1726882573.12616: in VariableManager get_vars() 28011 1726882573.12644: Calling all_inventory to load vars for managed_node1 28011 1726882573.12646: Calling groups_inventory to load vars for managed_node1 28011 1726882573.12650: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882573.12662: Calling all_plugins_play to load vars for managed_node1 28011 1726882573.12666: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882573.12669: Calling groups_plugins_play to load vars for managed_node1 28011 1726882573.13307: done sending task result for task 12673a56-9f93-962d-7c65-000000000881 28011 1726882573.13311: WORKER PROCESS EXITING 28011 1726882573.14212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882573.15962: done with get_vars() 28011 1726882573.15984: done getting variables 28011 1726882573.16044: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882573.16149: variable 'profile' from source: include params 28011 1726882573.16153: variable 'interface' from source: set_fact 28011 1726882573.16214: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:36:13 -0400 (0:00:00.053) 0:00:42.713 ****** 28011 1726882573.16246: entering _queue_task() for managed_node1/set_fact 28011 1726882573.16726: worker is 1 (out of 1 available) 28011 1726882573.16736: exiting _queue_task() for managed_node1/set_fact 28011 
1726882573.16746: done queuing things up, now waiting for results queue to drain 28011 1726882573.16748: waiting for pending results... 28011 1726882573.16839: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 28011 1726882573.16965: in run() - task 12673a56-9f93-962d-7c65-000000000882 28011 1726882573.16997: variable 'ansible_search_path' from source: unknown 28011 1726882573.17006: variable 'ansible_search_path' from source: unknown 28011 1726882573.17044: calling self._execute() 28011 1726882573.17141: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.17152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.17168: variable 'omit' from source: magic vars 28011 1726882573.17535: variable 'ansible_distribution_major_version' from source: facts 28011 1726882573.17550: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882573.17670: variable 'profile_stat' from source: set_fact 28011 1726882573.17689: Evaluated conditional (profile_stat.stat.exists): False 28011 1726882573.17697: when evaluation is False, skipping this task 28011 1726882573.17732: _execute() done 28011 1726882573.17735: dumping result to json 28011 1726882573.17738: done dumping result, returning 28011 1726882573.17740: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [12673a56-9f93-962d-7c65-000000000882] 28011 1726882573.17742: sending task result for task 12673a56-9f93-962d-7c65-000000000882 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28011 1726882573.17879: no more pending results, returning what we have 28011 1726882573.17883: results queue empty 28011 1726882573.17884: checking for any_errors_fatal 28011 1726882573.17896: done checking for any_errors_fatal 28011 1726882573.17897: 
checking for max_fail_percentage 28011 1726882573.17898: done checking for max_fail_percentage 28011 1726882573.17899: checking to see if all hosts have failed and the running result is not ok 28011 1726882573.17900: done checking to see if all hosts have failed 28011 1726882573.17901: getting the remaining hosts for this loop 28011 1726882573.17902: done getting the remaining hosts for this loop 28011 1726882573.17906: getting the next task for host managed_node1 28011 1726882573.17913: done getting next task for host managed_node1 28011 1726882573.17916: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 28011 1726882573.17920: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882573.17924: getting variables 28011 1726882573.17926: in VariableManager get_vars() 28011 1726882573.17952: Calling all_inventory to load vars for managed_node1 28011 1726882573.17954: Calling groups_inventory to load vars for managed_node1 28011 1726882573.17958: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882573.17970: Calling all_plugins_play to load vars for managed_node1 28011 1726882573.17973: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882573.17976: Calling groups_plugins_play to load vars for managed_node1 28011 1726882573.18607: done sending task result for task 12673a56-9f93-962d-7c65-000000000882 28011 1726882573.18610: WORKER PROCESS EXITING 28011 1726882573.19538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882573.21149: done with get_vars() 28011 1726882573.21168: done getting variables 28011 1726882573.21229: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882573.21329: variable 'profile' from source: include params 28011 1726882573.21333: variable 'interface' from source: set_fact 28011 1726882573.21389: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:36:13 -0400 (0:00:00.051) 0:00:42.765 ****** 28011 1726882573.21420: entering _queue_task() for managed_node1/command 28011 1726882573.21676: worker is 1 (out of 1 available) 28011 1726882573.21689: exiting _queue_task() for managed_node1/command 28011 
1726882573.21804: done queuing things up, now waiting for results queue to drain 28011 1726882573.21806: waiting for pending results... 28011 1726882573.21964: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 28011 1726882573.22081: in run() - task 12673a56-9f93-962d-7c65-000000000883 28011 1726882573.22106: variable 'ansible_search_path' from source: unknown 28011 1726882573.22113: variable 'ansible_search_path' from source: unknown 28011 1726882573.22155: calling self._execute() 28011 1726882573.22251: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.22262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.22277: variable 'omit' from source: magic vars 28011 1726882573.22638: variable 'ansible_distribution_major_version' from source: facts 28011 1726882573.22654: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882573.22778: variable 'profile_stat' from source: set_fact 28011 1726882573.22899: Evaluated conditional (profile_stat.stat.exists): False 28011 1726882573.22902: when evaluation is False, skipping this task 28011 1726882573.22905: _execute() done 28011 1726882573.22907: dumping result to json 28011 1726882573.22909: done dumping result, returning 28011 1726882573.22912: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 [12673a56-9f93-962d-7c65-000000000883] 28011 1726882573.22914: sending task result for task 12673a56-9f93-962d-7c65-000000000883 28011 1726882573.22972: done sending task result for task 12673a56-9f93-962d-7c65-000000000883 28011 1726882573.22976: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28011 1726882573.23025: no more pending results, returning what we have 28011 1726882573.23029: results queue empty 28011 
1726882573.23030: checking for any_errors_fatal 28011 1726882573.23038: done checking for any_errors_fatal 28011 1726882573.23039: checking for max_fail_percentage 28011 1726882573.23041: done checking for max_fail_percentage 28011 1726882573.23042: checking to see if all hosts have failed and the running result is not ok 28011 1726882573.23043: done checking to see if all hosts have failed 28011 1726882573.23043: getting the remaining hosts for this loop 28011 1726882573.23044: done getting the remaining hosts for this loop 28011 1726882573.23048: getting the next task for host managed_node1 28011 1726882573.23055: done getting next task for host managed_node1 28011 1726882573.23057: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 28011 1726882573.23062: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882573.23067: getting variables 28011 1726882573.23068: in VariableManager get_vars() 28011 1726882573.23099: Calling all_inventory to load vars for managed_node1 28011 1726882573.23101: Calling groups_inventory to load vars for managed_node1 28011 1726882573.23105: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882573.23117: Calling all_plugins_play to load vars for managed_node1 28011 1726882573.23120: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882573.23123: Calling groups_plugins_play to load vars for managed_node1 28011 1726882573.24738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882573.26292: done with get_vars() 28011 1726882573.26314: done getting variables 28011 1726882573.26367: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882573.26467: variable 'profile' from source: include params 28011 1726882573.26470: variable 'interface' from source: set_fact 28011 1726882573.26529: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:36:13 -0400 (0:00:00.051) 0:00:42.816 ****** 28011 1726882573.26558: entering _queue_task() for managed_node1/set_fact 28011 1726882573.27014: worker is 1 (out of 1 available) 28011 1726882573.27022: exiting _queue_task() for managed_node1/set_fact 28011 1726882573.27032: done queuing things up, now waiting for results queue to drain 28011 1726882573.27034: waiting for pending results... 
28011 1726882573.27163: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 28011 1726882573.27239: in run() - task 12673a56-9f93-962d-7c65-000000000884 28011 1726882573.27265: variable 'ansible_search_path' from source: unknown 28011 1726882573.27299: variable 'ansible_search_path' from source: unknown 28011 1726882573.27318: calling self._execute() 28011 1726882573.27414: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.27426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.27477: variable 'omit' from source: magic vars 28011 1726882573.27817: variable 'ansible_distribution_major_version' from source: facts 28011 1726882573.27834: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882573.27961: variable 'profile_stat' from source: set_fact 28011 1726882573.27980: Evaluated conditional (profile_stat.stat.exists): False 28011 1726882573.27992: when evaluation is False, skipping this task 28011 1726882573.28017: _execute() done 28011 1726882573.28020: dumping result to json 28011 1726882573.28022: done dumping result, returning 28011 1726882573.28127: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [12673a56-9f93-962d-7c65-000000000884] 28011 1726882573.28131: sending task result for task 12673a56-9f93-962d-7c65-000000000884 28011 1726882573.28191: done sending task result for task 12673a56-9f93-962d-7c65-000000000884 28011 1726882573.28197: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28011 1726882573.28273: no more pending results, returning what we have 28011 1726882573.28277: results queue empty 28011 1726882573.28278: checking for any_errors_fatal 28011 1726882573.28284: done checking for any_errors_fatal 28011 1726882573.28285: 
checking for max_fail_percentage 28011 1726882573.28290: done checking for max_fail_percentage 28011 1726882573.28291: checking to see if all hosts have failed and the running result is not ok 28011 1726882573.28292: done checking to see if all hosts have failed 28011 1726882573.28294: getting the remaining hosts for this loop 28011 1726882573.28297: done getting the remaining hosts for this loop 28011 1726882573.28300: getting the next task for host managed_node1 28011 1726882573.28310: done getting next task for host managed_node1 28011 1726882573.28313: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 28011 1726882573.28317: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882573.28322: getting variables 28011 1726882573.28324: in VariableManager get_vars() 28011 1726882573.28350: Calling all_inventory to load vars for managed_node1 28011 1726882573.28353: Calling groups_inventory to load vars for managed_node1 28011 1726882573.28357: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882573.28369: Calling all_plugins_play to load vars for managed_node1 28011 1726882573.28373: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882573.28376: Calling groups_plugins_play to load vars for managed_node1 28011 1726882573.29848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882573.31430: done with get_vars() 28011 1726882573.31450: done getting variables 28011 1726882573.31511: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882573.31619: variable 'profile' from source: include params 28011 1726882573.31623: variable 'interface' from source: set_fact 28011 1726882573.31677: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:36:13 -0400 (0:00:00.051) 0:00:42.868 ****** 28011 1726882573.31712: entering _queue_task() for managed_node1/assert 28011 1726882573.31974: worker is 1 (out of 1 available) 28011 1726882573.31985: exiting _queue_task() for managed_node1/assert 28011 1726882573.32103: done queuing things up, now waiting for results queue to drain 28011 1726882573.32106: waiting for pending results... 
28011 1726882573.32272: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'ethtest0' 28011 1726882573.32599: in run() - task 12673a56-9f93-962d-7c65-00000000086d 28011 1726882573.32603: variable 'ansible_search_path' from source: unknown 28011 1726882573.32605: variable 'ansible_search_path' from source: unknown 28011 1726882573.32608: calling self._execute() 28011 1726882573.32611: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.32613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.32615: variable 'omit' from source: magic vars 28011 1726882573.32920: variable 'ansible_distribution_major_version' from source: facts 28011 1726882573.32937: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882573.32953: variable 'omit' from source: magic vars 28011 1726882573.32996: variable 'omit' from source: magic vars 28011 1726882573.33103: variable 'profile' from source: include params 28011 1726882573.33112: variable 'interface' from source: set_fact 28011 1726882573.33180: variable 'interface' from source: set_fact 28011 1726882573.33210: variable 'omit' from source: magic vars 28011 1726882573.33254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882573.33302: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882573.33329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882573.33351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882573.33367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882573.33410: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 28011 1726882573.33420: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.33429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.33534: Set connection var ansible_connection to ssh 28011 1726882573.33548: Set connection var ansible_pipelining to False 28011 1726882573.33558: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882573.33568: Set connection var ansible_shell_executable to /bin/sh 28011 1726882573.33581: Set connection var ansible_timeout to 10 28011 1726882573.33600: Set connection var ansible_shell_type to sh 28011 1726882573.33699: variable 'ansible_shell_executable' from source: unknown 28011 1726882573.33707: variable 'ansible_connection' from source: unknown 28011 1726882573.33711: variable 'ansible_module_compression' from source: unknown 28011 1726882573.33713: variable 'ansible_shell_type' from source: unknown 28011 1726882573.33715: variable 'ansible_shell_executable' from source: unknown 28011 1726882573.33717: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.33719: variable 'ansible_pipelining' from source: unknown 28011 1726882573.33722: variable 'ansible_timeout' from source: unknown 28011 1726882573.33724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.33829: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882573.33846: variable 'omit' from source: magic vars 28011 1726882573.33856: starting attempt loop 28011 1726882573.33863: running the handler 28011 1726882573.33994: variable 'lsr_net_profile_exists' from source: set_fact 28011 1726882573.34007: Evaluated conditional (not 
lsr_net_profile_exists): True 28011 1726882573.34019: handler run complete 28011 1726882573.34042: attempt loop complete, returning result 28011 1726882573.34099: _execute() done 28011 1726882573.34102: dumping result to json 28011 1726882573.34104: done dumping result, returning 28011 1726882573.34106: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'ethtest0' [12673a56-9f93-962d-7c65-00000000086d] 28011 1726882573.34108: sending task result for task 12673a56-9f93-962d-7c65-00000000086d 28011 1726882573.34299: done sending task result for task 12673a56-9f93-962d-7c65-00000000086d 28011 1726882573.34303: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 28011 1726882573.34346: no more pending results, returning what we have 28011 1726882573.34349: results queue empty 28011 1726882573.34350: checking for any_errors_fatal 28011 1726882573.34355: done checking for any_errors_fatal 28011 1726882573.34356: checking for max_fail_percentage 28011 1726882573.34357: done checking for max_fail_percentage 28011 1726882573.34358: checking to see if all hosts have failed and the running result is not ok 28011 1726882573.34359: done checking to see if all hosts have failed 28011 1726882573.34360: getting the remaining hosts for this loop 28011 1726882573.34361: done getting the remaining hosts for this loop 28011 1726882573.34365: getting the next task for host managed_node1 28011 1726882573.34372: done getting next task for host managed_node1 28011 1726882573.34375: ^ task is: TASK: Include the task 'assert_device_absent.yml' 28011 1726882573.34377: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882573.34381: getting variables 28011 1726882573.34383: in VariableManager get_vars() 28011 1726882573.34414: Calling all_inventory to load vars for managed_node1 28011 1726882573.34417: Calling groups_inventory to load vars for managed_node1 28011 1726882573.34421: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882573.34431: Calling all_plugins_play to load vars for managed_node1 28011 1726882573.34435: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882573.34439: Calling groups_plugins_play to load vars for managed_node1 28011 1726882573.36035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882573.37709: done with get_vars() 28011 1726882573.37731: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:156 Friday 20 September 2024 21:36:13 -0400 (0:00:00.061) 0:00:42.929 ****** 28011 1726882573.37820: entering _queue_task() for managed_node1/include_tasks 28011 1726882573.38267: worker is 1 (out of 1 available) 28011 1726882573.38280: exiting _queue_task() for managed_node1/include_tasks 28011 1726882573.38292: done queuing things up, now waiting for results queue to drain 28011 1726882573.38296: waiting for pending results... 
28011 1726882573.38813: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 28011 1726882573.39099: in run() - task 12673a56-9f93-962d-7c65-0000000000f0 28011 1726882573.39103: variable 'ansible_search_path' from source: unknown 28011 1726882573.39115: calling self._execute() 28011 1726882573.39216: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.39334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.39352: variable 'omit' from source: magic vars 28011 1726882573.39840: variable 'ansible_distribution_major_version' from source: facts 28011 1726882573.39858: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882573.39870: _execute() done 28011 1726882573.39879: dumping result to json 28011 1726882573.39890: done dumping result, returning 28011 1726882573.39909: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [12673a56-9f93-962d-7c65-0000000000f0] 28011 1726882573.40018: sending task result for task 12673a56-9f93-962d-7c65-0000000000f0 28011 1726882573.40088: done sending task result for task 12673a56-9f93-962d-7c65-0000000000f0 28011 1726882573.40092: WORKER PROCESS EXITING 28011 1726882573.40147: no more pending results, returning what we have 28011 1726882573.40152: in VariableManager get_vars() 28011 1726882573.40190: Calling all_inventory to load vars for managed_node1 28011 1726882573.40192: Calling groups_inventory to load vars for managed_node1 28011 1726882573.40198: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882573.40212: Calling all_plugins_play to load vars for managed_node1 28011 1726882573.40216: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882573.40218: Calling groups_plugins_play to load vars for managed_node1 28011 1726882573.41826: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882573.43529: done with get_vars() 28011 1726882573.43554: variable 'ansible_search_path' from source: unknown 28011 1726882573.43567: we have included files to process 28011 1726882573.43568: generating all_blocks data 28011 1726882573.43570: done generating all_blocks data 28011 1726882573.43575: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28011 1726882573.43576: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28011 1726882573.43579: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28011 1726882573.43744: in VariableManager get_vars() 28011 1726882573.43770: done with get_vars() 28011 1726882573.43883: done processing included file 28011 1726882573.43885: iterating over new_blocks loaded from include file 28011 1726882573.43889: in VariableManager get_vars() 28011 1726882573.43900: done with get_vars() 28011 1726882573.43902: filtering new block on tags 28011 1726882573.43918: done filtering new block on tags 28011 1726882573.43920: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 28011 1726882573.43925: extending task lists for all hosts with included blocks 28011 1726882573.44083: done extending task lists 28011 1726882573.44085: done processing included files 28011 1726882573.44088: results queue empty 28011 1726882573.44088: checking for any_errors_fatal 28011 1726882573.44092: done checking for any_errors_fatal 28011 1726882573.44094: checking for max_fail_percentage 28011 1726882573.44095: done 
checking for max_fail_percentage 28011 1726882573.44096: checking to see if all hosts have failed and the running result is not ok 28011 1726882573.44096: done checking to see if all hosts have failed 28011 1726882573.44097: getting the remaining hosts for this loop 28011 1726882573.44098: done getting the remaining hosts for this loop 28011 1726882573.44101: getting the next task for host managed_node1 28011 1726882573.44104: done getting next task for host managed_node1 28011 1726882573.44106: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28011 1726882573.44108: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882573.44111: getting variables 28011 1726882573.44112: in VariableManager get_vars() 28011 1726882573.44119: Calling all_inventory to load vars for managed_node1 28011 1726882573.44121: Calling groups_inventory to load vars for managed_node1 28011 1726882573.44123: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882573.44128: Calling all_plugins_play to load vars for managed_node1 28011 1726882573.44131: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882573.44133: Calling groups_plugins_play to load vars for managed_node1 28011 1726882573.45703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882573.48841: done with get_vars() 28011 1726882573.48864: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:36:13 -0400 (0:00:00.113) 0:00:43.042 ****** 28011 1726882573.49151: entering _queue_task() for managed_node1/include_tasks 28011 1726882573.49701: worker is 1 (out of 1 available) 28011 1726882573.49712: exiting _queue_task() for managed_node1/include_tasks 28011 1726882573.49725: done queuing things up, now waiting for results queue to drain 28011 1726882573.49727: waiting for pending results... 
28011 1726882573.50411: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 28011 1726882573.50420: in run() - task 12673a56-9f93-962d-7c65-0000000008b5 28011 1726882573.50424: variable 'ansible_search_path' from source: unknown 28011 1726882573.50426: variable 'ansible_search_path' from source: unknown 28011 1726882573.50429: calling self._execute() 28011 1726882573.50798: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.50802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.50805: variable 'omit' from source: magic vars 28011 1726882573.51450: variable 'ansible_distribution_major_version' from source: facts 28011 1726882573.51466: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882573.51476: _execute() done 28011 1726882573.51483: dumping result to json 28011 1726882573.51490: done dumping result, returning 28011 1726882573.51503: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-962d-7c65-0000000008b5] 28011 1726882573.51512: sending task result for task 12673a56-9f93-962d-7c65-0000000008b5 28011 1726882573.51627: no more pending results, returning what we have 28011 1726882573.51633: in VariableManager get_vars() 28011 1726882573.51666: Calling all_inventory to load vars for managed_node1 28011 1726882573.51668: Calling groups_inventory to load vars for managed_node1 28011 1726882573.51671: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882573.51684: Calling all_plugins_play to load vars for managed_node1 28011 1726882573.51691: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882573.51696: Calling groups_plugins_play to load vars for managed_node1 28011 1726882573.52601: done sending task result for task 12673a56-9f93-962d-7c65-0000000008b5 28011 1726882573.52605: WORKER PROCESS EXITING 28011 
1726882573.54316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882573.65300: done with get_vars() 28011 1726882573.65322: variable 'ansible_search_path' from source: unknown 28011 1726882573.65324: variable 'ansible_search_path' from source: unknown 28011 1726882573.65364: we have included files to process 28011 1726882573.65365: generating all_blocks data 28011 1726882573.65366: done generating all_blocks data 28011 1726882573.65367: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28011 1726882573.65368: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28011 1726882573.65370: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28011 1726882573.65541: done processing included file 28011 1726882573.65544: iterating over new_blocks loaded from include file 28011 1726882573.65545: in VariableManager get_vars() 28011 1726882573.65558: done with get_vars() 28011 1726882573.65560: filtering new block on tags 28011 1726882573.65579: done filtering new block on tags 28011 1726882573.65582: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 28011 1726882573.65588: extending task lists for all hosts with included blocks 28011 1726882573.65697: done extending task lists 28011 1726882573.65698: done processing included files 28011 1726882573.65699: results queue empty 28011 1726882573.65700: checking for any_errors_fatal 28011 1726882573.65702: done checking for any_errors_fatal 28011 1726882573.65703: checking for max_fail_percentage 28011 1726882573.65703: done checking for 
max_fail_percentage 28011 1726882573.65704: checking to see if all hosts have failed and the running result is not ok 28011 1726882573.65705: done checking to see if all hosts have failed 28011 1726882573.65706: getting the remaining hosts for this loop 28011 1726882573.65707: done getting the remaining hosts for this loop 28011 1726882573.65709: getting the next task for host managed_node1 28011 1726882573.65713: done getting next task for host managed_node1 28011 1726882573.65715: ^ task is: TASK: Get stat for interface {{ interface }} 28011 1726882573.65717: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882573.65720: getting variables 28011 1726882573.65721: in VariableManager get_vars() 28011 1726882573.65729: Calling all_inventory to load vars for managed_node1 28011 1726882573.65731: Calling groups_inventory to load vars for managed_node1 28011 1726882573.65734: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882573.65739: Calling all_plugins_play to load vars for managed_node1 28011 1726882573.65741: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882573.65744: Calling groups_plugins_play to load vars for managed_node1 28011 1726882573.66952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882573.69409: done with get_vars() 28011 1726882573.69437: done getting variables 28011 1726882573.69795: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:36:13 -0400 (0:00:00.206) 0:00:43.249 ****** 28011 1726882573.69824: entering _queue_task() for managed_node1/stat 28011 1726882573.70590: worker is 1 (out of 1 available) 28011 1726882573.70604: exiting _queue_task() for managed_node1/stat 28011 1726882573.70617: done queuing things up, now waiting for results queue to drain 28011 1726882573.70620: waiting for pending results... 
28011 1726882573.71311: running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 28011 1726882573.71317: in run() - task 12673a56-9f93-962d-7c65-0000000008cf 28011 1726882573.71699: variable 'ansible_search_path' from source: unknown 28011 1726882573.71702: variable 'ansible_search_path' from source: unknown 28011 1726882573.71710: calling self._execute() 28011 1726882573.71713: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.71716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.71718: variable 'omit' from source: magic vars 28011 1726882573.72417: variable 'ansible_distribution_major_version' from source: facts 28011 1726882573.72698: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882573.72701: variable 'omit' from source: magic vars 28011 1726882573.72704: variable 'omit' from source: magic vars 28011 1726882573.72771: variable 'interface' from source: set_fact 28011 1726882573.72800: variable 'omit' from source: magic vars 28011 1726882573.73098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882573.73102: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882573.73104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882573.73122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882573.73139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882573.73175: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882573.73184: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.73194: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.73297: Set connection var ansible_connection to ssh 28011 1726882573.73509: Set connection var ansible_pipelining to False 28011 1726882573.73521: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882573.73531: Set connection var ansible_shell_executable to /bin/sh 28011 1726882573.73543: Set connection var ansible_timeout to 10 28011 1726882573.73551: Set connection var ansible_shell_type to sh 28011 1726882573.73580: variable 'ansible_shell_executable' from source: unknown 28011 1726882573.73588: variable 'ansible_connection' from source: unknown 28011 1726882573.73598: variable 'ansible_module_compression' from source: unknown 28011 1726882573.73606: variable 'ansible_shell_type' from source: unknown 28011 1726882573.73613: variable 'ansible_shell_executable' from source: unknown 28011 1726882573.73621: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882573.73629: variable 'ansible_pipelining' from source: unknown 28011 1726882573.73636: variable 'ansible_timeout' from source: unknown 28011 1726882573.73644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882573.74039: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28011 1726882573.74055: variable 'omit' from source: magic vars 28011 1726882573.74064: starting attempt loop 28011 1726882573.74070: running the handler 28011 1726882573.74089: _low_level_execute_command(): starting 28011 1726882573.74398: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882573.75572: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882573.75590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882573.75710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882573.75810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882573.75907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882573.77532: stdout chunk (state=3): >>>/root <<< 28011 1726882573.77634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882573.77671: stderr chunk (state=3): >>><<< 28011 1726882573.77680: stdout chunk (state=3): >>><<< 28011 1726882573.77714: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882573.77879: _low_level_execute_command(): starting 28011 1726882573.77882: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085 `" && echo ansible-tmp-1726882573.7780125-29958-130066404978085="` echo /root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085 `" ) && sleep 0' 28011 1726882573.79008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882573.79012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882573.79212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882573.79226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882573.79409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882573.81232: stdout chunk (state=3): >>>ansible-tmp-1726882573.7780125-29958-130066404978085=/root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085 <<< 28011 1726882573.81384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882573.81478: stderr chunk (state=3): >>><<< 28011 1726882573.81487: stdout chunk (state=3): >>><<< 28011 1726882573.81798: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882573.7780125-29958-130066404978085=/root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882573.81802: variable 'ansible_module_compression' from source: unknown 28011 1726882573.81804: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28011 1726882573.81806: variable 'ansible_facts' from source: unknown 28011 1726882573.82043: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085/AnsiballZ_stat.py 28011 1726882573.82315: Sending initial data 28011 1726882573.82324: Sent initial data (153 bytes) 28011 1726882573.83492: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 
28011 1726882573.83733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882573.83772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882573.85288: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28011 1726882573.85303: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882573.85408: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085/AnsiballZ_stat.py" <<< 28011 1726882573.85424: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpzlt71mil /root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085/AnsiballZ_stat.py <<< 28011 1726882573.85438: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpzlt71mil" to remote "/root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085/AnsiballZ_stat.py" <<< 28011 1726882573.87167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882573.87181: stderr chunk (state=3): >>><<< 28011 1726882573.87190: stdout chunk (state=3): >>><<< 28011 1726882573.87255: done transferring module to remote 28011 1726882573.87480: _low_level_execute_command(): starting 28011 1726882573.87484: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085/ /root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085/AnsiballZ_stat.py && sleep 0' 28011 1726882573.88502: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882573.88517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882573.88608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882573.88821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882573.88836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882573.88902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882573.90659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882573.90669: stdout chunk (state=3): >>><<< 28011 1726882573.90681: stderr chunk (state=3): >>><<< 28011 1726882573.90707: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882573.90716: _low_level_execute_command(): starting 28011 1726882573.90730: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085/AnsiballZ_stat.py && sleep 0' 28011 1726882573.92034: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882573.92044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882573.92081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882573.92100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882573.92113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 28011 1726882573.92246: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882573.92329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882573.92356: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 28011 1726882573.92368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882573.92500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882574.07552: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28011 1726882574.08696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882574.08714: stdout chunk (state=3): >>><<< 28011 1726882574.08730: stderr chunk (state=3): >>><<< 28011 1726882574.08760: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882574.08797: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882574.08815: _low_level_execute_command(): starting 28011 1726882574.08825: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882573.7780125-29958-130066404978085/ > /dev/null 2>&1 && sleep 0' 28011 1726882574.09470: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882574.09489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882574.09517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882574.09626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882574.09652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882574.09722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882574.11549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882574.11553: stdout chunk (state=3): >>><<< 28011 1726882574.11559: stderr chunk (state=3): >>><<< 28011 1726882574.11575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882574.11583: handler run complete 28011 1726882574.11605: attempt loop complete, returning result 28011 1726882574.11609: _execute() done 28011 1726882574.11611: dumping result to json 28011 1726882574.11798: done dumping result, returning 28011 1726882574.11801: done running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 [12673a56-9f93-962d-7c65-0000000008cf] 28011 1726882574.11803: sending task result for task 12673a56-9f93-962d-7c65-0000000008cf 28011 1726882574.11865: done sending task result for task 12673a56-9f93-962d-7c65-0000000008cf 28011 1726882574.11868: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 28011 1726882574.11933: no more pending results, returning what we have 28011 1726882574.11938: results queue empty 28011 1726882574.11939: checking for any_errors_fatal 28011 1726882574.11941: done checking for any_errors_fatal 28011 1726882574.11942: checking for max_fail_percentage 28011 1726882574.11943: done checking for max_fail_percentage 28011 1726882574.11945: checking to see if all hosts have failed and the running result is not ok 28011 1726882574.11945: done checking to see if all hosts have failed 28011 1726882574.11946: getting the remaining hosts for this loop 28011 1726882574.11948: done getting the remaining hosts for this loop 28011 1726882574.11952: getting the next task for host managed_node1 28011 1726882574.11960: done getting next task for host managed_node1 28011 1726882574.11963: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 28011 1726882574.11966: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882574.11970: getting variables 28011 1726882574.11972: in VariableManager get_vars() 28011 1726882574.12006: Calling all_inventory to load vars for managed_node1 28011 1726882574.12009: Calling groups_inventory to load vars for managed_node1 28011 1726882574.12013: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882574.12025: Calling all_plugins_play to load vars for managed_node1 28011 1726882574.12028: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882574.12031: Calling groups_plugins_play to load vars for managed_node1 28011 1726882574.14669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882574.16763: done with get_vars() 28011 1726882574.16834: done getting variables 28011 1726882574.16945: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28011 1726882574.17320: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:36:14 -0400 (0:00:00.475) 0:00:43.724 ****** 
28011 1726882574.17351: entering _queue_task() for managed_node1/assert 28011 1726882574.17861: worker is 1 (out of 1 available) 28011 1726882574.17874: exiting _queue_task() for managed_node1/assert 28011 1726882574.17889: done queuing things up, now waiting for results queue to drain 28011 1726882574.17890: waiting for pending results... 28011 1726882574.18432: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'ethtest0' 28011 1726882574.18664: in run() - task 12673a56-9f93-962d-7c65-0000000008b6 28011 1726882574.18677: variable 'ansible_search_path' from source: unknown 28011 1726882574.18680: variable 'ansible_search_path' from source: unknown 28011 1726882574.18719: calling self._execute() 28011 1726882574.18824: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882574.18830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882574.18889: variable 'omit' from source: magic vars 28011 1726882574.19629: variable 'ansible_distribution_major_version' from source: facts 28011 1726882574.19762: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882574.19767: variable 'omit' from source: magic vars 28011 1726882574.19900: variable 'omit' from source: magic vars 28011 1726882574.20016: variable 'interface' from source: set_fact 28011 1726882574.20035: variable 'omit' from source: magic vars 28011 1726882574.20075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882574.20258: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882574.20311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882574.20334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882574.20345: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882574.20377: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882574.20381: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882574.20392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882574.20524: Set connection var ansible_connection to ssh 28011 1726882574.20533: Set connection var ansible_pipelining to False 28011 1726882574.20539: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882574.20545: Set connection var ansible_shell_executable to /bin/sh 28011 1726882574.20553: Set connection var ansible_timeout to 10 28011 1726882574.20558: Set connection var ansible_shell_type to sh 28011 1726882574.20584: variable 'ansible_shell_executable' from source: unknown 28011 1726882574.20588: variable 'ansible_connection' from source: unknown 28011 1726882574.20698: variable 'ansible_module_compression' from source: unknown 28011 1726882574.20702: variable 'ansible_shell_type' from source: unknown 28011 1726882574.20704: variable 'ansible_shell_executable' from source: unknown 28011 1726882574.20706: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882574.20708: variable 'ansible_pipelining' from source: unknown 28011 1726882574.20711: variable 'ansible_timeout' from source: unknown 28011 1726882574.20713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882574.20776: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882574.20786: variable 'omit' from source: magic vars 28011 
1726882574.20795: starting attempt loop 28011 1726882574.20798: running the handler 28011 1726882574.20966: variable 'interface_stat' from source: set_fact 28011 1726882574.20975: Evaluated conditional (not interface_stat.stat.exists): True 28011 1726882574.20980: handler run complete 28011 1726882574.21000: attempt loop complete, returning result 28011 1726882574.21003: _execute() done 28011 1726882574.21006: dumping result to json 28011 1726882574.21010: done dumping result, returning 28011 1726882574.21013: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'ethtest0' [12673a56-9f93-962d-7c65-0000000008b6] 28011 1726882574.21019: sending task result for task 12673a56-9f93-962d-7c65-0000000008b6 28011 1726882574.21114: done sending task result for task 12673a56-9f93-962d-7c65-0000000008b6 28011 1726882574.21117: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 28011 1726882574.21209: no more pending results, returning what we have 28011 1726882574.21214: results queue empty 28011 1726882574.21215: checking for any_errors_fatal 28011 1726882574.21224: done checking for any_errors_fatal 28011 1726882574.21225: checking for max_fail_percentage 28011 1726882574.21227: done checking for max_fail_percentage 28011 1726882574.21228: checking to see if all hosts have failed and the running result is not ok 28011 1726882574.21229: done checking to see if all hosts have failed 28011 1726882574.21229: getting the remaining hosts for this loop 28011 1726882574.21231: done getting the remaining hosts for this loop 28011 1726882574.21235: getting the next task for host managed_node1 28011 1726882574.21243: done getting next task for host managed_node1 28011 1726882574.21252: ^ task is: TASK: Verify network state restored to default 28011 1726882574.21254: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882574.21259: getting variables 28011 1726882574.21261: in VariableManager get_vars() 28011 1726882574.21300: Calling all_inventory to load vars for managed_node1 28011 1726882574.21303: Calling groups_inventory to load vars for managed_node1 28011 1726882574.21307: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882574.21319: Calling all_plugins_play to load vars for managed_node1 28011 1726882574.21322: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882574.21326: Calling groups_plugins_play to load vars for managed_node1 28011 1726882574.23904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882574.25639: done with get_vars() 28011 1726882574.25660: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:158 Friday 20 September 2024 21:36:14 -0400 (0:00:00.084) 0:00:43.808 ****** 28011 1726882574.25753: entering _queue_task() for managed_node1/include_tasks 28011 1726882574.26076: worker is 1 (out of 1 available) 28011 1726882574.26091: exiting _queue_task() for managed_node1/include_tasks 28011 1726882574.26105: done queuing things up, now waiting for results queue to drain 28011 1726882574.26107: waiting for pending results... 
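The assert that just passed can be mimicked in plain Python: the conditional `not interface_stat.stat.exists` is applied to the JSON the `stat` module printed on stdout. A minimal sketch, where `raw` is a trimmed copy of the module output captured in the log (only the fields the assertion consults):

```python
import json

# Trimmed copy of the stat module's stdout from the log above.
raw = ('{"changed": false, "stat": {"exists": false}, '
       '"invocation": {"module_args": {"path": "/sys/class/net/ethtest0"}}}')

result = json.loads(raw)

# The assert task's conditional: `not interface_stat.stat.exists`
interface_absent = not result["stat"]["exists"]
print(interface_absent)  # True: ethtest0 was removed, so the assertion passes
```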
28011 1726882574.26510: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 28011 1726882574.26515: in run() - task 12673a56-9f93-962d-7c65-0000000000f1 28011 1726882574.26519: variable 'ansible_search_path' from source: unknown 28011 1726882574.26522: calling self._execute() 28011 1726882574.26588: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882574.26602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882574.26619: variable 'omit' from source: magic vars 28011 1726882574.26971: variable 'ansible_distribution_major_version' from source: facts 28011 1726882574.26989: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882574.27004: _execute() done 28011 1726882574.27013: dumping result to json 28011 1726882574.27022: done dumping result, returning 28011 1726882574.27033: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [12673a56-9f93-962d-7c65-0000000000f1] 28011 1726882574.27043: sending task result for task 12673a56-9f93-962d-7c65-0000000000f1 28011 1726882574.27204: no more pending results, returning what we have 28011 1726882574.27209: in VariableManager get_vars() 28011 1726882574.27240: Calling all_inventory to load vars for managed_node1 28011 1726882574.27242: Calling groups_inventory to load vars for managed_node1 28011 1726882574.27246: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882574.27260: Calling all_plugins_play to load vars for managed_node1 28011 1726882574.27263: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882574.27266: Calling groups_plugins_play to load vars for managed_node1 28011 1726882574.27787: done sending task result for task 12673a56-9f93-962d-7c65-0000000000f1 28011 1726882574.27790: WORKER PROCESS EXITING 28011 1726882574.28530: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882574.29388: done with get_vars() 28011 1726882574.29403: variable 'ansible_search_path' from source: unknown 28011 1726882574.29412: we have included files to process 28011 1726882574.29413: generating all_blocks data 28011 1726882574.29414: done generating all_blocks data 28011 1726882574.29417: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 28011 1726882574.29417: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 28011 1726882574.29419: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 28011 1726882574.29668: done processing included file 28011 1726882574.29670: iterating over new_blocks loaded from include file 28011 1726882574.29671: in VariableManager get_vars() 28011 1726882574.29678: done with get_vars() 28011 1726882574.29679: filtering new block on tags 28011 1726882574.29689: done filtering new block on tags 28011 1726882574.29691: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 28011 1726882574.29695: extending task lists for all hosts with included blocks 28011 1726882574.29836: done extending task lists 28011 1726882574.29837: done processing included files 28011 1726882574.29837: results queue empty 28011 1726882574.29838: checking for any_errors_fatal 28011 1726882574.29840: done checking for any_errors_fatal 28011 1726882574.29840: checking for max_fail_percentage 28011 1726882574.29841: done checking for max_fail_percentage 28011 1726882574.29841: checking to see if all hosts have failed and the running 
result is not ok 28011 1726882574.29842: done checking to see if all hosts have failed 28011 1726882574.29842: getting the remaining hosts for this loop 28011 1726882574.29843: done getting the remaining hosts for this loop 28011 1726882574.29845: getting the next task for host managed_node1 28011 1726882574.29847: done getting next task for host managed_node1 28011 1726882574.29848: ^ task is: TASK: Check routes and DNS 28011 1726882574.29849: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882574.29851: getting variables 28011 1726882574.29851: in VariableManager get_vars() 28011 1726882574.29857: Calling all_inventory to load vars for managed_node1 28011 1726882574.29858: Calling groups_inventory to load vars for managed_node1 28011 1726882574.29859: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882574.29863: Calling all_plugins_play to load vars for managed_node1 28011 1726882574.29864: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882574.29866: Calling groups_plugins_play to load vars for managed_node1 28011 1726882574.30916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882574.31899: done with get_vars() 28011 1726882574.31913: done getting variables 28011 1726882574.31939: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:36:14 -0400 (0:00:00.062) 0:00:43.870 ****** 28011 1726882574.31958: entering _queue_task() for managed_node1/shell 28011 1726882574.32169: worker is 1 (out of 1 available) 28011 1726882574.32182: exiting _queue_task() for managed_node1/shell 28011 1726882574.32194: done queuing things up, now waiting for results queue to drain 28011 1726882574.32196: waiting for pending results... 
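The "filtering new block on tags" step recorded above decides which tasks from the included `check_network_dns.yml` file actually run. A simplified sketch of that check, assuming the basic rule that a task is kept when no tags were requested or when its tags intersect the requested set (Ansible's real logic also handles specials like `always` and `never`, omitted here):

```python
# Hedged sketch of tag filtering for an included block's tasks.
def keep_task(task_tags, requested_tags):
    # No --tags on the command line: everything runs.
    if not requested_tags:
        return True
    # Otherwise keep the task only if some requested tag matches.
    return bool(set(task_tags) & set(requested_tags))

print(keep_task(["network"], []))        # → True
print(keep_task(["dns"], ["network"]))   # → False
```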
28011 1726882574.32364: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 28011 1726882574.32431: in run() - task 12673a56-9f93-962d-7c65-0000000008e7 28011 1726882574.32445: variable 'ansible_search_path' from source: unknown 28011 1726882574.32449: variable 'ansible_search_path' from source: unknown 28011 1726882574.32474: calling self._execute() 28011 1726882574.32544: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882574.32549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882574.32562: variable 'omit' from source: magic vars 28011 1726882574.32837: variable 'ansible_distribution_major_version' from source: facts 28011 1726882574.32846: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882574.32854: variable 'omit' from source: magic vars 28011 1726882574.32885: variable 'omit' from source: magic vars 28011 1726882574.32909: variable 'omit' from source: magic vars 28011 1726882574.32939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882574.32964: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882574.32980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882574.32998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882574.33008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882574.33032: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882574.33035: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882574.33037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882574.33109: 
Set connection var ansible_connection to ssh 28011 1726882574.33116: Set connection var ansible_pipelining to False 28011 1726882574.33121: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882574.33126: Set connection var ansible_shell_executable to /bin/sh 28011 1726882574.33133: Set connection var ansible_timeout to 10 28011 1726882574.33138: Set connection var ansible_shell_type to sh 28011 1726882574.33155: variable 'ansible_shell_executable' from source: unknown 28011 1726882574.33158: variable 'ansible_connection' from source: unknown 28011 1726882574.33161: variable 'ansible_module_compression' from source: unknown 28011 1726882574.33164: variable 'ansible_shell_type' from source: unknown 28011 1726882574.33166: variable 'ansible_shell_executable' from source: unknown 28011 1726882574.33168: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882574.33170: variable 'ansible_pipelining' from source: unknown 28011 1726882574.33173: variable 'ansible_timeout' from source: unknown 28011 1726882574.33177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882574.33275: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882574.33283: variable 'omit' from source: magic vars 28011 1726882574.33290: starting attempt loop 28011 1726882574.33295: running the handler 28011 1726882574.33304: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882574.33322: 
_low_level_execute_command(): starting 28011 1726882574.33328: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882574.33785: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882574.33814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.33817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882574.33820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.33877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882574.33882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882574.33885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882574.33924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882574.35577: stdout chunk (state=3): >>>/root <<< 28011 1726882574.35696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882574.35700: stdout chunk (state=3): >>><<< 28011 1726882574.35702: stderr chunk (state=3): >>><<< 28011 1726882574.35722: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882574.35799: _low_level_execute_command(): starting 28011 1726882574.35804: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509 `" && echo ansible-tmp-1726882574.3573318-29985-174318584592509="` echo /root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509 `" ) && sleep 0' 28011 1726882574.36235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882574.36246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 28011 1726882574.36267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28011 1726882574.36270: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.36324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882574.36327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882574.36374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882574.38226: stdout chunk (state=3): >>>ansible-tmp-1726882574.3573318-29985-174318584592509=/root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509 <<< 28011 1726882574.38364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882574.38399: stderr chunk (state=3): >>><<< 28011 1726882574.38402: stdout chunk (state=3): >>><<< 28011 1726882574.38421: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882574.3573318-29985-174318584592509=/root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882574.38603: variable 'ansible_module_compression' from source: unknown 28011 1726882574.38607: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28011 1726882574.38609: variable 'ansible_facts' from source: unknown 28011 1726882574.38633: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509/AnsiballZ_command.py 28011 1726882574.38739: Sending initial data 28011 1726882574.38742: Sent initial data (156 bytes) 28011 1726882574.39329: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.39388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882574.39406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882574.39459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882574.39508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882574.41016: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28011 1726882574.41023: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882574.41057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882574.41098: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpea3sdbl_ /root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509/AnsiballZ_command.py <<< 28011 1726882574.41103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509/AnsiballZ_command.py" <<< 28011 1726882574.41141: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmpea3sdbl_" to remote "/root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509/AnsiballZ_command.py" <<< 28011 1726882574.41143: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509/AnsiballZ_command.py" <<< 28011 1726882574.41754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882574.41896: stderr chunk (state=3): >>><<< 28011 1726882574.41899: stdout chunk (state=3): >>><<< 28011 1726882574.41902: done transferring module to remote 28011 1726882574.41904: _low_level_execute_command(): starting 28011 1726882574.41906: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509/ /root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509/AnsiballZ_command.py && sleep 0' 28011 1726882574.42396: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882574.42403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 
1726882574.42425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882574.42432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.42484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882574.42491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882574.42531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882574.44607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882574.44625: stderr chunk (state=3): >>><<< 28011 1726882574.44628: stdout chunk (state=3): >>><<< 28011 1726882574.44645: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882574.44648: _low_level_execute_command(): starting 28011 1726882574.44652: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509/AnsiballZ_command.py && sleep 0' 28011 1726882574.45297: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882574.45301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.45377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882574.45390: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 28011 1726882574.45446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882574.61175: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:30:0b:a1:42:23 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2611sec preferred_lft 2611sec\n inet6 fe80::1030:bff:fea1:4223/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:36:14.602529", "end": "2024-09-20 21:36:14.610599", "delta": "0:00:00.008070", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882574.62544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 28011 1726882574.62590: stderr chunk (state=3): >>><<< 28011 1726882574.62596: stdout chunk (state=3): >>><<< 28011 1726882574.62621: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:30:0b:a1:42:23 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2611sec preferred_lft 2611sec\n inet6 fe80::1030:bff:fea1:4223/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:36:14.602529", "end": "2024-09-20 21:36:14.610599", "delta": "0:00:00.008070", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip 
-6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
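The module run above returns its result as a single JSON blob on stdout, with keys like `rc`, `changed`, `stdout`, and `cmd`, which the controller then parses. A minimal sketch of consuming such a result, with the `stdout` payload abbreviated from the log (the section labels `IP`, `IP ROUTE`, `RESOLV` come from the diagnostic script's `echo` lines):

```python
import json

# Hedged sketch: parse a command-module result like the one in the log.
# The payload here is abbreviated; only the structure matters.
raw = json.dumps({
    "changed": True,
    "rc": 0,
    "stdout": "IP\n1: lo: ...\nIP ROUTE\ndefault via 10.31.8.1 ...\nRESOLV\nnameserver 10.29.169.13",
})

result = json.loads(raw)
assert result["rc"] == 0, "diagnostic script failed"

# Recover the labelled sections the script echoed between commands.
headers = [line for line in result["stdout"].splitlines()
           if line in ("IP", "IP ROUTE", "IP -6 ROUTE", "RESOLV")]
print(headers)  # → ['IP', 'IP ROUTE', 'RESOLV']
```

The `set -euo pipefail` prefix in the logged script means any failing command (e.g. a missing `ip` binary) would surface as a nonzero `rc` here rather than partial output.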
28011 1726882574.62669: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882574.62676: _low_level_execute_command(): starting 28011 1726882574.62682: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882574.3573318-29985-174318584592509/ > /dev/null 2>&1 && sleep 0' 28011 1726882574.63197: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28011 1726882574.63200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.63203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 28011 
1726882574.63205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 28011 1726882574.63207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.63255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882574.63262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882574.63264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882574.63305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882574.65064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882574.65099: stderr chunk (state=3): >>><<< 28011 1726882574.65102: stdout chunk (state=3): >>><<< 28011 1726882574.65111: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882574.65117: handler run complete 28011 1726882574.65140: Evaluated conditional (False): False 28011 1726882574.65156: attempt loop complete, returning result 28011 1726882574.65160: _execute() done 28011 1726882574.65162: dumping result to json 28011 1726882574.65180: done dumping result, returning 28011 1726882574.65182: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [12673a56-9f93-962d-7c65-0000000008e7] 28011 1726882574.65184: sending task result for task 12673a56-9f93-962d-7c65-0000000008e7 28011 1726882574.65304: done sending task result for task 12673a56-9f93-962d-7c65-0000000008e7 28011 1726882574.65307: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008070", "end": "2024-09-20 21:36:14.610599", "rc": 0, "start": "2024-09-20 21:36:14.602529" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:30:0b:a1:42:23 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2611sec preferred_lft 2611sec inet6 fe80::1030:bff:fea1:4223/64 scope 
link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 28011 1726882574.65372: no more pending results, returning what we have 28011 1726882574.65375: results queue empty 28011 1726882574.65376: checking for any_errors_fatal 28011 1726882574.65378: done checking for any_errors_fatal 28011 1726882574.65378: checking for max_fail_percentage 28011 1726882574.65380: done checking for max_fail_percentage 28011 1726882574.65381: checking to see if all hosts have failed and the running result is not ok 28011 1726882574.65381: done checking to see if all hosts have failed 28011 1726882574.65382: getting the remaining hosts for this loop 28011 1726882574.65383: done getting the remaining hosts for this loop 28011 1726882574.65387: getting the next task for host managed_node1 28011 1726882574.65420: done getting next task for host managed_node1 28011 1726882574.65424: ^ task is: TASK: Verify DNS and network connectivity 28011 1726882574.65426: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882574.65430: getting variables 28011 1726882574.65432: in VariableManager get_vars() 28011 1726882574.65463: Calling all_inventory to load vars for managed_node1 28011 1726882574.65465: Calling groups_inventory to load vars for managed_node1 28011 1726882574.65468: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882574.65479: Calling all_plugins_play to load vars for managed_node1 28011 1726882574.65482: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882574.65484: Calling groups_plugins_play to load vars for managed_node1 28011 1726882574.66673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882574.67755: done with get_vars() 28011 1726882574.67778: done getting variables 28011 1726882574.67829: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:36:14 -0400 (0:00:00.358) 0:00:44.229 ****** 28011 1726882574.67851: entering _queue_task() for managed_node1/shell 28011 1726882574.68076: worker is 1 (out of 1 available) 28011 1726882574.68088: exiting _queue_task() for managed_node1/shell 28011 1726882574.68101: done queuing things up, now waiting for results queue to drain 28011 1726882574.68102: waiting for pending results... 
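The "Verify DNS and network connectivity" task being queued here runs a `getent`/`curl` loop over mirror hosts (its full text appears in the module result further down). A minimal standalone sketch of the lookup loop is below; the host argument (`localhost`) is an illustrative stand-in, not what the real task checks, and the `curl` reachability probe from the original script is shown only as a comment since it needs outbound network access:

```shell
#!/bin/sh
# Sketch of the DNS-lookup loop from the "Verify DNS and network
# connectivity" task. The real task iterates over
# mirrors.fedoraproject.org and mirrors.centos.org; the host list here
# is a parameter so the sketch can run against any name.
set -eu

check_hosts() {
    for host in "$@"; do
        if ! getent hosts "$host"; then
            echo "FAILED to lookup host $host"
            return 1
        fi
        # The original script then probes reachability, roughly:
        #   curl -o /dev/null "https://$host" \
        #     || { echo "FAILED to contact host $host"; return 1; }
    done
    echo "ALL LOOKUPS OK"
}

check_hosts localhost
```

Because `getent hosts` consults the NSS resolver stack (hosts file plus DNS), its output above is what fills the task's STDOUT with the resolved `wildcard.fedoraproject.org` address lines.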
28011 1726882574.68269: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 28011 1726882574.68354: in run() - task 12673a56-9f93-962d-7c65-0000000008e8 28011 1726882574.68365: variable 'ansible_search_path' from source: unknown 28011 1726882574.68369: variable 'ansible_search_path' from source: unknown 28011 1726882574.68401: calling self._execute() 28011 1726882574.68472: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882574.68476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882574.68486: variable 'omit' from source: magic vars 28011 1726882574.68761: variable 'ansible_distribution_major_version' from source: facts 28011 1726882574.68772: Evaluated conditional (ansible_distribution_major_version != '6'): True 28011 1726882574.68897: variable 'ansible_facts' from source: unknown 28011 1726882574.69427: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 28011 1726882574.69431: variable 'omit' from source: magic vars 28011 1726882574.69459: variable 'omit' from source: magic vars 28011 1726882574.69480: variable 'omit' from source: magic vars 28011 1726882574.69514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28011 1726882574.69546: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28011 1726882574.69603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28011 1726882574.69617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882574.69629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28011 1726882574.69666: variable 'inventory_hostname' from source: host vars for 'managed_node1' 28011 1726882574.69669: variable 
'ansible_host' from source: host vars for 'managed_node1' 28011 1726882574.69672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882574.69765: Set connection var ansible_connection to ssh 28011 1726882574.69773: Set connection var ansible_pipelining to False 28011 1726882574.69778: Set connection var ansible_module_compression to ZIP_DEFLATED 28011 1726882574.69783: Set connection var ansible_shell_executable to /bin/sh 28011 1726882574.69792: Set connection var ansible_timeout to 10 28011 1726882574.69812: Set connection var ansible_shell_type to sh 28011 1726882574.69830: variable 'ansible_shell_executable' from source: unknown 28011 1726882574.69833: variable 'ansible_connection' from source: unknown 28011 1726882574.69836: variable 'ansible_module_compression' from source: unknown 28011 1726882574.69838: variable 'ansible_shell_type' from source: unknown 28011 1726882574.69841: variable 'ansible_shell_executable' from source: unknown 28011 1726882574.69843: variable 'ansible_host' from source: host vars for 'managed_node1' 28011 1726882574.69846: variable 'ansible_pipelining' from source: unknown 28011 1726882574.69848: variable 'ansible_timeout' from source: unknown 28011 1726882574.69853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 28011 1726882574.69962: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882574.69972: variable 'omit' from source: magic vars 28011 1726882574.69975: starting attempt loop 28011 1726882574.69978: running the handler 28011 1726882574.69990: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28011 1726882574.70009: _low_level_execute_command(): starting 28011 1726882574.70017: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28011 1726882574.70658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.70664: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.70715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882574.70730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882574.70789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882574.72330: stdout chunk (state=3): >>>/root <<< 28011 1726882574.72443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882574.72472: stderr chunk (state=3): >>><<< 28011 1726882574.72476: stdout chunk (state=3): 
>>><<< 28011 1726882574.72507: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882574.72526: _low_level_execute_command(): starting 28011 1726882574.72530: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577 `" && echo ansible-tmp-1726882574.7250488-29997-51521818920577="` echo /root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577 `" ) && sleep 0' 28011 1726882574.73011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882574.73066: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.73069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.73131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882574.73134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882574.73163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882574.75006: stdout chunk (state=3): >>>ansible-tmp-1726882574.7250488-29997-51521818920577=/root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577 <<< 28011 1726882574.75115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882574.75137: stderr chunk (state=3): >>><<< 28011 1726882574.75140: stdout chunk (state=3): >>><<< 28011 1726882574.75162: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882574.7250488-29997-51521818920577=/root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882574.75180: variable 'ansible_module_compression' from source: unknown 28011 1726882574.75222: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-280111j7y_g2o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28011 1726882574.75253: variable 'ansible_facts' from source: unknown 28011 1726882574.75308: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577/AnsiballZ_command.py 28011 1726882574.75401: Sending initial data 28011 1726882574.75404: Sent initial data (155 bytes) 28011 1726882574.75830: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882574.75833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 
1726882574.75836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882574.75838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.75881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882574.75884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882574.75934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882574.77436: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28011 1726882574.77442: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28011 1726882574.77473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28011 1726882574.77516: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp7bh27hk8 /root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577/AnsiballZ_command.py <<< 28011 1726882574.77519: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577/AnsiballZ_command.py" <<< 28011 1726882574.77558: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-280111j7y_g2o/tmp7bh27hk8" to remote "/root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577/AnsiballZ_command.py" <<< 28011 1726882574.78072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882574.78109: stderr chunk (state=3): >>><<< 28011 1726882574.78113: stdout chunk (state=3): >>><<< 28011 1726882574.78132: done transferring module to remote 28011 1726882574.78139: _low_level_execute_command(): starting 28011 1726882574.78144: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577/ /root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577/AnsiballZ_command.py && sleep 0' 28011 1726882574.78551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882574.78555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882574.78557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass <<< 28011 1726882574.78559: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882574.78564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.78599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882574.78616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882574.78654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882574.80354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882574.80376: stderr chunk (state=3): >>><<< 28011 1726882574.80379: stdout chunk (state=3): >>><<< 28011 1726882574.80392: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882574.80397: _low_level_execute_command(): starting 28011 1726882574.80400: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577/AnsiballZ_command.py && sleep 0' 28011 1726882574.80766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882574.80795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28011 1726882574.80799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 28011 1726882574.80801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882574.80804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28011 1726882574.80806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 
1726882574.80855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882574.80858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882574.80911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882575.26190: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1492 0 --:--:-- --:--:-- --:--:-- 1487\n % Total % Received % Xferd Average Speed Time 
Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3748 0 --:--:-- --:--:-- --:--:-- 3779", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:36:14.956483", "end": "2024-09-20 21:36:15.260692", "delta": "0:00:00.304209", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28011 1726882575.27889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 28011 1726882575.27896: stdout chunk (state=3): >>><<< 28011 1726882575.27899: stderr chunk (state=3): >>><<< 28011 1726882575.28052: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1492 0 --:--:-- --:--:-- --:--:-- 1487\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3748 0 --:--:-- --:--:-- --:--:-- 3779", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor 
host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:36:14.956483", "end": "2024-09-20 21:36:15.260692", "delta": "0:00:00.304209", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 28011 1726882575.28063: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28011 1726882575.28066: _low_level_execute_command(): starting 28011 1726882575.28068: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882574.7250488-29997-51521818920577/ > /dev/null 2>&1 && sleep 0' 28011 1726882575.28629: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28011 1726882575.28710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28011 1726882575.28753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 28011 1726882575.28770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28011 1726882575.28791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28011 1726882575.28858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28011 1726882575.30901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28011 1726882575.30906: stdout chunk (state=3): >>><<< 28011 1726882575.30907: stderr chunk (state=3): >>><<< 28011 1726882575.30909: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28011 1726882575.30911: handler run complete 28011 1726882575.30913: Evaluated conditional (False): False 28011 1726882575.30915: attempt loop complete, returning result 28011 1726882575.30916: _execute() done 28011 1726882575.30918: dumping result to json 28011 1726882575.30919: done dumping result, returning 28011 1726882575.30921: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [12673a56-9f93-962d-7c65-0000000008e8] 28011 1726882575.30922: sending task result for task 12673a56-9f93-962d-7c65-0000000008e8 28011 1726882575.30996: done sending task result for task 12673a56-9f93-962d-7c65-0000000008e8
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.304209",
    "end": "2024-09-20 21:36:15.260692",
    "rc": 0,
    "start": "2024-09-20 21:36:14.956483"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 305 100 305 0 0 1492 0 --:--:-- --:--:-- --:--:-- 1487
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 291 100 291 0 0 3748 0 --:--:-- --:--:-- --:--:-- 3779

28011 1726882575.31069: no more pending results, returning what we have 28011 1726882575.31073:
results queue empty 28011 1726882575.31073: checking for any_errors_fatal 28011 1726882575.31084: done checking for any_errors_fatal 28011 1726882575.31085: checking for max_fail_percentage 28011 1726882575.31087: done checking for max_fail_percentage 28011 1726882575.31088: checking to see if all hosts have failed and the running result is not ok 28011 1726882575.31089: done checking to see if all hosts have failed 28011 1726882575.31089: getting the remaining hosts for this loop 28011 1726882575.31090: done getting the remaining hosts for this loop 28011 1726882575.31097: getting the next task for host managed_node1 28011 1726882575.31105: done getting next task for host managed_node1 28011 1726882575.31112: WORKER PROCESS EXITING 28011 1726882575.31300: ^ task is: TASK: meta (flush_handlers) 28011 1726882575.31312: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882575.31317: getting variables 28011 1726882575.31318: in VariableManager get_vars() 28011 1726882575.31345: Calling all_inventory to load vars for managed_node1 28011 1726882575.31348: Calling groups_inventory to load vars for managed_node1 28011 1726882575.31354: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882575.31364: Calling all_plugins_play to load vars for managed_node1 28011 1726882575.31366: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882575.31369: Calling groups_plugins_play to load vars for managed_node1 28011 1726882575.32643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882575.33836: done with get_vars() 28011 1726882575.33852: done getting variables 28011 1726882575.33904: in VariableManager get_vars() 28011 1726882575.33912: Calling all_inventory to load vars for managed_node1 28011 1726882575.33914: Calling groups_inventory to load vars for managed_node1 28011 1726882575.33916: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882575.33919: Calling all_plugins_play to load vars for managed_node1 28011 1726882575.33921: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882575.33922: Calling groups_plugins_play to load vars for managed_node1 28011 1726882575.34576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882575.35936: done with get_vars() 28011 1726882575.35960: done queuing things up, now waiting for results queue to drain 28011 1726882575.35961: results queue empty 28011 1726882575.35962: checking for any_errors_fatal 28011 1726882575.35964: done checking for any_errors_fatal 28011 1726882575.35965: checking for max_fail_percentage 28011 1726882575.35966: done checking for max_fail_percentage 28011 1726882575.35966: checking to see if all hosts have failed and the running result is not 
ok 28011 1726882575.35966: done checking to see if all hosts have failed 28011 1726882575.35967: getting the remaining hosts for this loop 28011 1726882575.35968: done getting the remaining hosts for this loop 28011 1726882575.35970: getting the next task for host managed_node1 28011 1726882575.35972: done getting next task for host managed_node1 28011 1726882575.35973: ^ task is: TASK: meta (flush_handlers) 28011 1726882575.35974: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28011 1726882575.35976: getting variables 28011 1726882575.35977: in VariableManager get_vars() 28011 1726882575.35982: Calling all_inventory to load vars for managed_node1 28011 1726882575.35984: Calling groups_inventory to load vars for managed_node1 28011 1726882575.35985: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882575.35990: Calling all_plugins_play to load vars for managed_node1 28011 1726882575.35994: Calling groups_plugins_inventory to load vars for managed_node1 28011 1726882575.35997: Calling groups_plugins_play to load vars for managed_node1 28011 1726882575.36720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882575.37576: done with get_vars() 28011 1726882575.37591: done getting variables 28011 1726882575.37631: in VariableManager get_vars() 28011 1726882575.37637: Calling all_inventory to load vars for managed_node1 28011 1726882575.37639: Calling groups_inventory to load vars for managed_node1 28011 1726882575.37640: Calling all_plugins_inventory to load vars for managed_node1 28011 1726882575.37643: Calling all_plugins_play to load vars for managed_node1 28011 1726882575.37645: Calling groups_plugins_inventory to load vars for 
managed_node1 28011 1726882575.37646: Calling groups_plugins_play to load vars for managed_node1 28011 1726882575.38503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28011 1726882575.39752: done with get_vars() 28011 1726882575.39780: done queuing things up, now waiting for results queue to drain 28011 1726882575.39781: results queue empty 28011 1726882575.39782: checking for any_errors_fatal 28011 1726882575.39783: done checking for any_errors_fatal 28011 1726882575.39783: checking for max_fail_percentage 28011 1726882575.39784: done checking for max_fail_percentage 28011 1726882575.39784: checking to see if all hosts have failed and the running result is not ok 28011 1726882575.39785: done checking to see if all hosts have failed 28011 1726882575.39785: getting the remaining hosts for this loop 28011 1726882575.39786: done getting the remaining hosts for this loop 28011 1726882575.39788: getting the next task for host managed_node1 28011 1726882575.39795: done getting next task for host managed_node1 28011 1726882575.39796: ^ task is: None 28011 1726882575.39797: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28011 1726882575.39800: done queuing things up, now waiting for results queue to drain 28011 1726882575.39802: results queue empty 28011 1726882575.39802: checking for any_errors_fatal 28011 1726882575.39803: done checking for any_errors_fatal 28011 1726882575.39804: checking for max_fail_percentage 28011 1726882575.39805: done checking for max_fail_percentage 28011 1726882575.39806: checking to see if all hosts have failed and the running result is not ok 28011 1726882575.39806: done checking to see if all hosts have failed 28011 1726882575.39808: getting the next task for host managed_node1 28011 1726882575.39811: done getting next task for host managed_node1 28011 1726882575.39811: ^ task is: None 28011 1726882575.39813: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node1              : ok=90   changed=6    unreachable=0    failed=0    skipped=91   rescued=0    ignored=1

Friday 20 September 2024  21:36:15 -0400 (0:00:00.720)       0:00:44.950 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.93s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.81s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.78s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.74s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.27s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.22s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Create veth interface ethtest0 ------------------------------------------ 1.05s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.00s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.97s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.94s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:149
Gathering Facts --------------------------------------------------------- 0.92s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.84s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` ---------- 0.78s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:135
Install iproute --------------------------------------------------------- 0.76s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which packages are installed --- 0.75s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.75s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.73s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
28011 1726882575.39916: RUNNING CLEANUP
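For readability, the "Verify DNS and network connectivity" task whose execution this log traces can be reconstructed from the logged module args (`_raw_params` with `_uses_shell: true`, dispatched as `ansible.legacy.command`). This is a sketch only: the playbook source file is not part of this log, so the `ansible.builtin.shell` spelling and the task layout below are assumptions.

```yaml
# Hedged reconstruction from the logged _raw_params; the actual task file is
# not shown in this log, so the module spelling and layout are assumptions.
- name: Verify DNS and network connectivity
  ansible.builtin.shell: |
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
      # getent consults the system resolver (NSS), so this checks DNS the
      # same way other local programs would
      if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
      fi
      # a plain curl fetch confirms HTTPS reachability; output is discarded
      if ! curl -o /dev/null https://"$host"; then
        echo FAILED to contact host "$host"
        exit 1
      fi
    done
```

With `set -euo pipefail` and the explicit `exit 1` branches, any lookup or connectivity failure makes the task fail with a `FAILED ...` line in stdout; here both hosts resolved (to IPv6 addresses behind `wildcard.fedoraproject.org`) and both curl fetches returned rc=0.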